Thursday, March 29, 2012

How to Overcome Google's Over Optimization Filter


A few weeks ago, Matt Cutts, the head of Google's webspam team, announced that Google is working on a penalty for over-optimized or, better termed, overly SEO'ed websites. Hmmm, so what does this announcement actually mean, how will it affect your website, and how should you overcome this Google ranking filter?

Well, here's what Matt Cutts said in detail:
"We don’t normally pre-announce changes but there is something we are working on in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over-optimization or overly SEO – versus those making great content and great sites. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now."
WOW! - Now that's a new challenge for all spammers out there

Does this mean that you shouldn't optimize your website anymore? No, not at all.
This announcement is meant for people who are trying to go beyond the limits. Of course, it is perfectly okay to add your keywords on your webpage and it is also perfectly okay to exchange links with other websites. Actually, all of these methods can have a very positive influence on the Google rankings of your website. 
But, but, but! The problem starts when these elements are overused. Yeah, that's right. This usually happens when you use fully automated tools and services to promote your website. And when I say fully automated tools, you know exactly which tools I'm talking about!
Risky SEO methods
For example, the following (grey hat/black hat) SEO methods are likely to get your website in trouble:
  • Automated link exchange networks that add your link to hundreds of other websites. In other words, networks that blast your links across their member sites with automated tools may be considered spam.
  • Tools that automatically create fake forum accounts and comments with a link to your site. That means forum links, profile links, and blog comment links built with semi- or fully automated tools.
  • Tools that automatically create keyword-rich web pages on your website. Don't install tools or plugins that scrape content based on provided keywords and publish it on your web pages.
  • Participating in paid link schemes. Avoid links from a page, or a portion of a page, labeled "Paid Links," "Sponsored Links," and the like. Remember, Google hates paid links.

Fully or semi-automated solutions almost always mean spam when it comes to SEO
Google loves to show high-quality web pages with unique content in its search results. Web pages created automatically do not have the quality of pages created by a human.
Believe me, fully or semi-automated solutions, whatever you call them, lead to low-quality links and content that is of no use to most web surfers. Firms that "optimized" their websites with such methods are the ones that will get filtered by the new Google ranking filter.
If you come across words like "fully automated" or promises of "hundreds" or "thousands" of backlinks or pages of content, you can be sure that the advertised product or service is spam, and your website will get in trouble if you promote it with these methods.
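
To make that rule of thumb concrete, here is a minimal sketch in Python of the kind of red-flag check you could run on a service's sales pitch. The phrase list is purely illustrative, an assumption of ours rather than any official blacklist:

```python
# Illustrative red-flag check for link-spam marketing copy.
# The phrase list is an assumption; extend it with patterns you see in the wild.
RED_FLAGS = [
    "fully automated",
    "hundreds of backlinks",
    "thousands of backlinks",
    "thousands of links",
]

def looks_like_link_spam(ad_copy: str) -> bool:
    """Return True if the sales copy matches the warning signs described above."""
    text = ad_copy.lower()
    return any(phrase in text for phrase in RED_FLAGS)

print(looks_like_link_spam("Get thousands of backlinks overnight - fully automated!"))  # True
```
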
Safe SEO methods

Don't panic: create a symbiotic relationship with search engines
If you haven't used the spam methods described above to promote your website, then your website and your Google rankings won't be in trouble.
Optimizing your web pages is still very important because you have to show Google and other search engines that your website is relevant to a particular targeted keyword and that it has high-quality content. The key is to focus on the user, not the search engines.
Secure your Google rankings by following white hat SEO methods or hire an SEO Outsourcing Company which follows search engine guidelines.

Always try to optimize your web pages, but don't forget to play by the rules. If you optimize correctly, your search engine rankings will last, resulting in more customers and more sales.

If you push the limits too far and try to play tricks, your website will get in trouble sooner or later. If you want long-lasting search engine results, use only white hat SEO methods.



Wednesday, March 21, 2012

How to judge the quality of a backlink?

You want lasting results when you build links to your website. It does not make sense to invest your time in search engine optimization methods that are just a flash-in-the-pan.

How do you judge the quality of a website? What is a good website and from which web pages should you get links?

1. The PageRank of a website is not that important

Many webmasters only want to get backlinks from pages with a particular PageRank. While you can use this method, it is usually a waste of time, and it makes link building more difficult than it needs to be.

If a website has an overall high quality then it does not matter if the page with the link to your website has a low Google PageRank:

  • If a high-quality website adds a new page, the new page will have an initial PageRank of zero. Nevertheless, the page can still be very good.
  • A page that has a PageRank of zero today can have a high PageRank tomorrow.

If only pages with a high PageRank had a chance, it wouldn’t be possible to get new pages in Google’s result page. Experience shows that new pages appear in Google’s results every day.

In addition, the PageRank that Google publicly displays is not the actual PageRank that Google uses in its algorithm and the PageRank value can be manipulated.

2. Common sense leads to lasting results

You do not need special metrics to judge the quality of a web page. When you find a web page that could link to your site, ask yourself the following questions:


  • Does the linking page look good to the average web surfer?
  • Does the page have interesting content? Is the content somewhat related to your website?
  • Does it make sense for that web page to link to your site?
If you can answer all questions with “yes” then you should try to get a backlink from that page. It doesn’t matter if that page has a low PageRank.

3. Give Google what it wants and your website will succeed

Google tries to imitate common sense with its algorithm. If you use common sense to build your links and follow the tips above, you can be confident that the backlinks to your website will keep counting in future updates of Google's algorithm.


Source

Thursday, March 8, 2012

Google's Advice for All Free Hosts - Keep It Spam Free

Free web hosting services can be great! Many of these services have helped to lower costs and technical barriers for webmasters and they continue to enable beginner webmasters to start their adventure on the web. Unfortunately, sometimes these lower barriers (meant to encourage less techy audiences) can attract some dodgy characters like spammers who look for cheap and easy ways to set up dozens or hundreds of sites that add little or no value to the web. When it comes to automatically generated sites, our stance remains the same: if the sites do not add sufficient value, we generally consider them as spam and take appropriate steps to protect our users from exposure to such sites in our natural search results.


We consider automatically generated sites like this one to be spammy.

If a free hosting service begins to show patterns of spam, we make a strong effort to be granular and tackle only spammy pages or sites. However, in some cases, when the spammers have pretty much taken over the free web hosting service or a large fraction of the service, we may be forced to take more decisive steps to protect our users and remove the entire free web hosting service from our search results. To prevent this from happening, we would like to help owners of free web hosting services by sharing what we think may help you save valuable resources like bandwidth and processing power, and also protect your hosting service from these spammers:

Publish a clear abuse policy and communicate it to your users, for example during the sign-up process. This step will contribute to transparency on what you consider to be spammy activity.

In your sign-up form, consider using CAPTCHAs or similar verification tools to only allow human submissions and prevent automated scripts from generating a bunch of sites on your hosting service. While these methods may not be 100% foolproof, they can help to keep a lot of the bad actors out.
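
As a rough illustration, server-side verification of a CAPTCHA during signup could look like the sketch below. It uses reCAPTCHA's siteverify endpoint; the secret key is a placeholder, and the token is whatever your signup form's CAPTCHA widget submits:

```python
# Minimal sketch of server-side reCAPTCHA verification at signup time.
import requests

SECRET_KEY = "your-recaptcha-secret-key"  # placeholder - use your real secret

def is_human(captcha_token: str, client_ip: str) -> bool:
    """Ask Google's siteverify endpoint whether the CAPTCHA was solved."""
    resp = requests.post(
        "https://www.google.com/recaptcha/api/siteverify",
        data={"secret": SECRET_KEY, "response": captcha_token, "remoteip": client_ip},
        timeout=5,
    )
    return resp.json().get("success", False)

# Reject the signup (or queue it for manual review) when is_human() is False.
```
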

Try to monitor your free hosting service for other spam signals like redirections, large numbers of ad blocks, certain spammy keywords, large sections of escaped JavaScript code, etc. Using the site: operator query or Google Alerts may come in handy if you’re looking for a simple, cost efficient solution.
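
A monitoring script could grep each hosted page for a few of these signals. The sketch below is an assumption-heavy starting point; the keyword list and thresholds would need tuning for your own service:

```python
# Rough page-level spam-signal scan: meta-refresh redirects, excessive ad
# blocks, spammy keywords, and large runs of escaped JavaScript.
import re

SPAM_KEYWORDS = {"viagra", "casino", "payday loan"}  # illustrative only

def spam_signals(html: str) -> list:
    found = []
    lower = html.lower()
    if re.search(r'http-equiv=["\']refresh["\']', html, re.I):
        found.append("meta-refresh redirect")
    if lower.count("google_ad_client") > 5:  # many ad blocks on one page
        found.append("excessive ad blocks")
    if any(kw in lower for kw in SPAM_KEYWORDS):
        found.append("spammy keywords")
    if len(re.findall(r'\\x[0-9a-fA-F]{2}', html)) > 500:  # escaped JS payload
        found.append("large escaped JavaScript section")
    return found
```
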

Keep a record of signups and try to identify typical spam patterns like form completion time, number of requests sent from the same IP address range, user-agents used during signup, user names or other form-submitted values chosen during signup, etc. Again, these may not always be conclusive.
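
Assuming you log each signup's IP address and how long the form took to complete, simple heuristics like the sketch below can surface bot-driven bursts (field names and thresholds are our own assumptions):

```python
# Flag signups completed implausibly fast or arriving in bursts from one
# /24 IP range. `signups` is a list of dicts: {"user", "ip", "form_seconds"}.
from collections import Counter

def suspicious_signups(signups, min_seconds=3, burst_threshold=20):
    flagged = []
    per_range = Counter(s["ip"].rsplit(".", 1)[0] for s in signups)  # /24 bucket
    for s in signups:
        if s["form_seconds"] < min_seconds:
            flagged.append((s["user"], "form completed too fast"))
        ip_range = s["ip"].rsplit(".", 1)[0]
        if per_range[ip_range] > burst_threshold:
            flagged.append((s["user"], "signup burst from " + ip_range + ".0/24"))
    return flagged
```
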

Keep an eye on your webserver log files for sudden traffic spikes, especially when a newly-created site is receiving this traffic, and try to identify why you are spending more bandwidth and processing power.
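
One simple way to spot such spikes is sketched below, under the assumption that you can tally requests per hosted site (e.g. per vhost or subdomain) from your access logs; the 10x factor is arbitrary:

```python
# Compare today's per-site hit counts against a recent baseline and flag
# sites whose traffic jumped by a large factor.
from collections import Counter

def spiking_sites(todays_hits: Counter, baseline_hits: Counter, factor=10):
    """Return sites receiving `factor` times more hits than their baseline.
    The floor of 50 keeps brand-new, tiny sites from being flagged on noise."""
    return [site for site, hits in todays_hits.items()
            if hits > factor * max(baseline_hits.get(site, 0), 50)]

# Build the Counters by parsing your webserver logs, keyed by the
# subdomain/vhost that served each request.
```
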

Try to monitor your free web hosting service for phishing and malware-infected pages. For example, you can use the Google Safe Browsing API to regularly test URLs from your service, or sign up to receive alerts for your AS.
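
A sketch of such a check is below. It uses the Safe Browsing v4 threatMatches:find endpoint, which is the current incarnation of the API (the Lookup API available in 2012 differed); the API key and client ID are placeholders:

```python
# Query Google Safe Browsing (v4) for known-bad URLs hosted on your service.
import requests

API_KEY = "your-api-key"  # placeholder

def unsafe_urls(urls):
    body = {
        "client": {"clientId": "my-free-host", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }
    resp = requests.post(
        "https://safebrowsing.googleapis.com/v4/threatMatches:find",
        params={"key": API_KEY}, json=body, timeout=10,
    )
    return [m["threat"]["url"] for m in resp.json().get("matches", [])]
```
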

Come up with a few sanity checks. For example, if you’re running a local Polish free web hosting service, what are the odds of thousands of new and legitimate sites in Japanese being created overnight on your service? There are a number of tools you may find useful for language detection of newly created sites, for example language detection libraries or the Google Translate API v2.
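
For instance, a sanity check for that hypothetical Polish host might look like this sketch, which uses the third-party langdetect library (pip install langdetect); the expected-language set is an assumption:

```python
# Flag newly created sites whose content is in an unexpected language.
from langdetect import detect

EXPECTED_LANGUAGES = {"pl", "en"}  # assumption for a Polish free host

def unexpected_language(page_text: str) -> bool:
    try:
        return detect(page_text) not in EXPECTED_LANGUAGES
    except Exception:  # too little text to classify reliably
        return False

print(unexpected_language("これは日本語のページです。"))  # True - Japanese detected
```
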

Last but not least, if you run a free web hosting service be sure to monitor your services for sudden activity spikes that may indicate a spam attack in progress.

For more tips on running a quality hosting service, have a look at our previous post. Lastly, be sure to sign up and verify your site in Google Webmaster Tools so we may be able to notify you when needed or if we see issues.

Google's Matt Cutts also posted a video on this topic:

http://youtu.be/oNKxl_VHRFg

Source

Wednesday, March 7, 2012

7 characteristics of how Google evaluates backlinks



What exactly did Google announce?

Google announced several changes to its ranking algorithm. The most important one concerns backlink analysis:

"Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable."

What does this mean for your web page rankings?

Unfortunately, Google doesn't go into detail. As mentioned in Google's statement, Google uses "characteristics of links" to figure out the topic of a linked page.

We'll take a look at the characteristics below. Then we'll try to find out which element could have been changed.

Characteristic 1: the text of the link

The anchor text of a link has been the most important factor for a long time. If many sites link to a page with the anchor text "green apple" then the page will get high rankings for the term "green apple".

Characteristic 2: the link power of the linking page

Links from pages with many inbound links have a higher influence than links from pages with few backlinks.

Characteristic 3: content and page proximity

If you sell shoes on your website then the link to your website will have a bigger impact if the text that surrounds the link to your page is about shoes. If the link to your website is surrounded by totally unrelated text then the link won't count as much.
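
To make "surrounding text" concrete, here is a rough sketch that pulls the text around a specific link using BeautifulSoup; the 200-character window is an arbitrary assumption, not anything Google has published:

```python
# Extract the text immediately surrounding a link, to judge its context.
from bs4 import BeautifulSoup

def link_context(html: str, target_href: str, window: int = 200) -> str:
    """Return the text around the first link whose href matches target_href."""
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("a", href=target_href)
    if link is None:
        return ""
    context = link.find_parent().get_text(" ", strip=True)
    anchor = link.get_text(strip=True)
    i = max(context.find(anchor), 0)
    return context[max(0, i - window): i + len(anchor) + window]

# A link to a shoe shop should count for more when this returns text about shoes:
# "shoes" in link_context(page_html, "http://example-shoe-shop.com/").lower()
```
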

Characteristic 4: link attributes such as nofollow and title

Links that use the rel=nofollow attribute don't affect the position of your web pages on Google. Some webmasters think that title attributes containing the targeted keywords can have a positive effect; others think that this could trigger a spam filter.
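
For reference, this is what those attributes look like when you inspect a link programmatically; a minimal sketch with BeautifulSoup:

```python
# Inspect a link's rel and title attributes.
from bs4 import BeautifulSoup

html = '<a href="http://example.com/" rel="nofollow" title="green apple">apple</a>'
link = BeautifulSoup(html, "html.parser").a

is_nofollow = "nofollow" in (link.get("rel") or [])  # bs4 parses rel as a list
title_text = link.get("title", "")                   # "green apple"
print(is_nofollow, title_text)                       # True green apple
```
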

Characteristic 5: redirects and shortened URLs

Redirects and shortened URLs such as http://bit.ly/a4X4th are URLs that redirect to another URL. Should these URLs carry full weight when it comes to calculating the position of a website?
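
You can at least see where a shortened URL ends up by following its redirects; a minimal sketch with the requests library:

```python
# Resolve a shortened URL to the final destination it redirects to.
import requests

def resolve(short_url: str) -> str:
    """Follow redirects and return the final URL."""
    # Some servers mishandle HEAD requests; fall back to GET if needed.
    resp = requests.head(short_url, allow_redirects=True, timeout=5)
    return resp.url

# resolve("http://bit.ly/a4X4th") would return whatever long URL it points to.
```
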

Characteristic 6: the age of a link

The older a link is, the more Google tends to trust that link. Links that remain for a long time seem to have a bigger impact than links that come and go.

Characteristic 7: the affiliation of the linking sites

Links from websites that have the same owner and links from affiliates may have a different influence on the rankings of a web page than links from websites that are not affiliated with the page.

Of course, we can only speculate what Google has changed. It is likely that Google now only considers the anchor text of a link if that link is placed in the right context (page and content proximity). Another possibility is that affiliate links (redirected or with page variables) will have less influence on the rankings of a page.

Just continue to get good backlinks and optimize the content of your pages. We provide professional SEO services for all kinds of large, medium, and small business SEO requirements.

Source

Thursday, March 1, 2012

Which Factor Did Google Turn Off For Link Evaluation?

As part of some of the 40 Google changes announced last night, Google said they made a change to how they evaluate links.

Google wrote:
Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.

As you can read, Google decided to turn off one of the many analysis methods they use for their link evaluation. It was a method or factor they used for "several years" and decided to turn it off in the past 30 days.

A WebmasterWorld thread is asking, which method do you think they turned off?

WebmasterWorld's link building moderator gave his ideas:

Title Tag
The title tag is a signal of the general theme of a web page, and it has been used to help identify the meaning of a linked page from the theme of the page linking to it. The topic of a link can vary from a side topic to something more granular, or even something completely off-topic.

Surrounding Text
This establishes the context of a link, thus helping to define what the linked web page is about. This also helps identify if a link is paid for or is associated with a donation.

Position of a link
Where a link is located is an important signal. A link in the footer is presumed to be less important than a link within the body of a web page. Navigational links are presumed to be devalued by a set amount, perhaps more than outbound links but still just a fraction of a normal, non-devalued link.

HTML signals
These include Heading Tags [w3.org], bold, italics, capitalization and font size [webmasterworld.com]. Font size is an interesting candidate for deprecation.

Get your website optimized by our SEO Services India and become one of Panda's favorite sites.

40 More Google Search Quality Updates From Google



Here is Google's full list:

More coverage for related searches. [launch codename “Fuzhou”] This launch brings in a new data source to help generate the “Searches related to” section, increasing coverage significantly so the feature will appear for more queries. This section contains search queries that can help you refine what you’re searching for.

Tweak to categorizer for expanded sitelinks. [launch codename “Snippy”, project codename “Megasitelinks”] This improvement adjusts a signal we use to try and identify duplicate snippets. We were applying a categorizer that wasn’t performing well for our expanded sitelinks, so we’ve stopped applying the categorizer in those cases. The result is more relevant sitelinks.

Less duplication in expanded sitelinks. [launch codename “thanksgiving”, project codename “Megasitelinks”] We’ve adjusted signals to reduce duplication in the snippets for expanded sitelinks. Now we generate relevant snippets based more on the page content and less on the query.

More consistent thumbnail sizes on results page. We’ve adjusted the thumbnail size for most image content appearing on the results page, providing a more consistent experience across result types, and also across mobile and tablet. The new sizes apply to rich snippet results for recipes and applications, movie posters, shopping results, book results, news results and more.

More locally relevant predictions in YouTube. [project codename “Suggest”] We’ve improved the ranking for predictions in YouTube to provide more locally relevant queries. For example, for the query [lady gaga in ] performed on the US version of YouTube, we might predict [lady gaga in times square], but for the same search performed on the Indian version of YouTube, we might predict [lady gaga in India].

More accurate detection of official pages. [launch codename “WRE”] We’ve made an adjustment to how we detect official pages to make more accurate identifications. The result is that many pages that were previously misidentified as official will no longer be.

Refreshed per-URL country information. [Launch codename “longdew”, project codename “country-id data refresh”] We updated the country associations for URLs to use more recent data.

Expand the size of our images index in Universal Search. [launch codename “terra”, project codename “Images Universal”] We launched a change to expand the corpus of results for which we show images in Universal Search. This is especially helpful to give more relevant images on a larger set of searches.

Minor tuning of autocomplete policy algorithms. [project codename “Suggest”] We have a narrow set of policies for autocomplete for offensive and inappropriate terms. This improvement continues to refine the algorithms we use to implement these policies.

“Site:” query update [launch codename “Semicolon”, project codename “Dice”] This change improves the ranking for queries using the “site:” operator by increasing the diversity of results.

Improved detection for SafeSearch in Image Search. [launch codename "Michandro", project codename “SafeSearch”] This change improves our signals for detecting adult content in Image Search, aligning the signals more closely with the signals we use for our other search results.

Interval based history tracking for indexing. [project codename “Intervals”] This improvement changes the signals we use in document tracking algorithms.

Improvements to foreign language synonyms. [launch codename “floating context synonyms”, project codename “Synonyms”] This change applies an improvement we previously launched for English to all other languages. The net impact is that you’ll more often find relevant pages that include synonyms for your query terms.

Disabling two old fresh query classifiers. [launch codename “Mango”, project codename “Freshness”] As search evolves and new signals and classifiers are applied to rank search results, sometimes old algorithms get outdated. This improvement disables two old classifiers related to query freshness.

More organized search results for Google Korea. [launch codename “smoothieking”, project codename “Sokoban4”] This significant improvement to search in Korea better organizes the search results into sections for news, blogs and homepages.

Fresher images. [launch codename “tumeric”] We’ve adjusted our signals for surfacing fresh images. Now we can more often surface fresh images when they appear on the web.

Update to the Google bar. [project codename “Kennedy”] We continue to iterate in our efforts to deliver a beautifully simple experience across Google products, and as part of that this month we made further adjustments to the Google bar. The biggest change is that we’ve replaced the drop-down Google menu in the November redesign with a consistent and expanded set of links running across the top of the page.

Adding three new languages to classifier related to error pages. [launch codename "PNI", project codename "Soft404"] We have signals designed to detect crypto 404 pages (also known as “soft 404s”), pages that return valid text to a browser but the text only contains error messages, such as “Page not found.” It’s rare that a user will be looking for such a page, so it’s important we be able to detect them. This change extends a particular classifier to Portuguese, Dutch and Italian.

Improvements to travel-related searches. [launch codename “nesehorn”] We’ve made improvements to triggering for a variety of flight-related search queries. These changes improve the user experience for our Flight Search feature with users getting more accurate flight results.

Data refresh for related searches signal. [launch codename “Chicago”, project codename “Related Search”] One of the many signals we look at to generate the “Searches related to” section is the queries users type in succession. If users very often search for [apple] right after [banana], that’s a sign the two might be related. This update refreshes the model we use to generate these refinements, leading to more relevant queries to try.

International launch of shopping rich snippets. [project codename “rich snippets”] Shopping rich snippets help you more quickly identify which sites are likely to have the most relevant product for your needs, highlighting product prices, availability, ratings and review counts. This month we expanded shopping rich snippets globally (they were previously only available in the US, Japan and Germany).

Improvements to Korean spelling. This launch improves spelling corrections when the user performs a Korean query in the wrong keyboard mode (also known as an "IME", or input method editor). Specifically, this change helps users who mistakenly enter Hangul queries in Latin mode or vice-versa.

Improvements to freshness. [launch codename “iotfreshweb”, project codename “Freshness”] We’ve applied new signals which help us surface fresh content in our results even more quickly than before.

Web History in 20 new countries. With Web History, you can browse and search over your search history and webpages you've visited. You will also get personalized search results that are more relevant to you, based on what you’ve searched for and which sites you’ve visited in the past. In order to deliver more relevant and personalized search results, we’ve launched Web History in Malaysia, Pakistan, Philippines, Morocco, Belarus, Kazakhstan, Estonia, Kuwait, Iraq, Sri Lanka, Tunisia, Nigeria, Lebanon, Luxembourg, Bosnia and Herzegowina, Azerbaijan, Jamaica, Trinidad and Tobago, Republic of Moldova, and Ghana. Web History is turned on only for people who have a Google Account and previously enabled Web History.

Improved snippets for video channels. Some search results are links to channels with many different videos, whether on mtv.com, Hulu or YouTube. We’ve had a feature for a while now that displays snippets for these results including direct links to the videos in the channel, and this improvement increases quality and expands coverage of these rich “decorated” snippets. We’ve also made some improvements to our backends used to generate the snippets.

Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.

Improvements to English spell correction. [launch codename “Kamehameha”] This change improves spelling correction quality in English, especially for rare queries, by making one of our scoring functions more accurate.

Improvements to coverage of News Universal. [launch codename “final destination”] We’ve fixed a bug that caused News Universal results not to appear in cases when our testing indicates they’d be very useful.

Consolidation of signals for spiking topics. [launch codename “news deserving score”, project codename “Freshness”] We use a number of signals to detect when a new topic is spiking in popularity. This change consolidates some of the signals so we can rely on signals we can compute in realtime, rather than signals that need to be processed offline. This eliminates redundancy in our systems and helps to ensure we can continue to detect spiking topics as quickly as possible.

Better triggering for Turkish weather search feature. [launch codename “hava”] We’ve tuned the signals we use to decide when to present Turkish users with the weather search feature. The result is that we’re able to provide our users with the weather forecast right on the results page with more frequency and accuracy.

Visual refresh to account settings page. We completed a visual refresh of the account settings page, making the page more consistent with the rest of our constantly evolving design.

Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.

Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.

SafeSearch update. We have updated how we deal with adult content, making it more accurate and robust. Now, irrelevant adult content is less likely to show up for many queries.

Spam update. In the process of investigating some potential spam, we found and fixed some weaknesses in our spam protections.

Improved local results. We launched a new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user.

Baidu vs. Google in China


Google says it is still focused on China, despite the fact that it relocated its search engine to Hong Kong in 2010. Of all the challenges that Google faces there, the fierce competition provided by Chinese rival Baidu is one of the biggest.

Baidu is simply killing Google in its core market, search. Yet outside of China little is known of the company, which is very much like Google but with key differences.

  • Were you aware, for example, that Baidu has more than 50 communities and services online, including maps, an encyclopedia, a mobile OS and more?
  • Did you realise that Google briefly held a 2 percent share of the Chinese search giant before it entered China itself?
  • Did you know that Baidu prioritises paid search terms above organic results without any limit on the number that appear?
We’re usually skeptical of infographics, but these stats, and more, are brought to us in a neat visual comparison from Digimind, which asks: ‘Is the battle already lost for Google?’

Looking at its prime focus, search, Baidu is streets ahead of Google, controlling 83.6 percent of all searches in China against Google’s 11.1 percent.

However, when you look at search in terms of revenue, the picture isn’t so shabby. Google accounts for 18 percent of the industry’s estimated $866 million (5.5 billion yuan) annual revenue; 0.18 × $866 million works out to roughly $156 million, so the US firm is making more than $155 million from search alone in China.

While its search battle may be lost, as Baidu is unlikely to be caught, we’d say that the rest of the ‘war’ is very much on for Google. The company has plenty of other irons in the fire to keep it busy and appealing to China’s 500 million plus Internet users and its massive mobile market.

Google is in a far better position in China than many other Web companies, like Facebook or Twitter, which see their popular Western services blocked in the country. For those firms, the battle in China is truly lost…for now, at least.