
SEO Tips and Tricks for 2014: Latest and Updated (by Shankar)

Over the past few years, SEO has been changing constantly. Google takes it very seriously that users get the right content for the queries they search for, and to that end it has made numerous algorithm updates over the past year.

These are the updates Google has made in its algorithms in the year 2013.

  • Panda #24 – January 22, 2013
  • Panda #25 – March 14, 2013
  • Phantom – May 9, 2013
  • Domain Crowding – May 21, 2013
  • Penguin 2.0 – May 22, 2013
  • PayDay Loan Update – June 11, 2013
  • Panda Dance – June 11, 2013
  • Multi-week Update – June 22, 2013
  • Panda Recovery – July 18, 2013
  • Knowledge Graph Expansion – July 19, 2013
  • Hummingbird – August 20, 2013
  • Penguin 2.1 – October 4, 2013
  • Authorship Shake-up – December 19, 2013

Now, let’s look at the updates Google has made so far in 2014.

  • Page Layout #2 – February 6, 2014
  • Unnamed update – March 24, 2014
  • PayDay Loan 2.0 – May 16, 2014
  • Panda 4.0 – May 19, 2014
There is still half a year to go, and experts are expecting many more updates from Google. Here’s the list of all Google’s SEO algorithms since 2000 and their updates. More on the Google Panda and Google Penguin updates. Check out all the Google algorithm updates in its history.
NOTE: If you are new to technical terms like Panda and Penguin, go through the following links:
  1. What is Google Panda Update
  2. What is Google Penguin Update
  3. Difference between the Panda and Penguin Updates

Here are a few SEO tips and tricks for 2014 that will help you get going.

#1: One interesting thing I observed in 2014: whether a site is small or large, we can rank (good news for small website owners!). Previously this was not the case, and small website owners never aimed high, fearing the authority bigger sites carry with Google. I have seen so many positive results (which I cannot share here), and this is an encouraging development.
#2: Please don’t go after backlinks. Concentrate on creating more quality content instead, and don’t waste too much money or time acquiring backlinks.
#3: STOP doing guest posts just to obtain links. Google has already made several announcements about this.
#4: TRY to engage viewers (through comments and feedback), and make sure the site has clear navigation so that visitors can move through the content without any complexity.
#5: Google Panda 4.0 hit sites blocking CSS and JavaScript. This news comes from Seroundtable.com, which reported that several site owners saw this. So avoid blocking CSS and JavaScript, especially if you were penalized in this recent Google Panda update. As one live example, Joost (a developer) recovered his site from a similar issue; check this link.
Here are some more essential things to follow to maintain your rankings.

#1 Have a Responsive Web Design

There has been drastic growth in the use of smartphones and tablets in the past few years, and most of those users prefer browsing the internet on those devices. Make sure that your website’s content is easily readable on all such devices; in short, improve your website’s responsive design. This is a very important factor for SEO: if your website doesn’t provide a good user experience, users will shift to one that does.
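If you want a quick way to spot pages that are not mobile-ready, one rough signal is the viewport meta tag that responsive designs rely on. A minimal sketch, using only standard-library Python and a placeholder URL:

```python
from urllib.request import urlopen

def has_viewport_meta(url: str) -> bool:
    html = urlopen(url).read().decode("utf-8", errors="ignore").lower()
    # Crude string check; a real audit would parse the HTML properly.
    return 'name="viewport"' in html

if __name__ == "__main__":
    # Placeholder URL; substitute your own pages.
    print(has_viewport_meta("https://example.com"))
```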

#2 Use Social Media

Publicizing your article on social media is a very important technique, not only to get more hits, but also to improve your SEO. Google+’s +1 is said to carry around 0.37% of the weightage of your page’s SEO. So, the better you reach out to people on social media, the better it is for your website.

#3 Brand building – Authorship

Although Google has recently announced that they will remove the authorship images from search results, SEO experts, after many experiments, say that authorship can still affect a website in a positive manner.
However, only the image is dropped; authorship itself stays in the search results. To set up authorship you will need a Google+ account. Even today, no one in the SEO industry has confirmed the importance of authorship-verified content in search rankings, whether through experiments or case studies, and Google has never confirmed it either, so its weight as a ranking factor remains an open question.
This is how authorship markup looks in the search results.

#4 Go with a Content Management System (CMS)

Use a proper content management system that offers various plugins. WordPress is one of the most used content management systems nowadays. It not only provides amazing plugins but also helps you structure your content and images in a way that makes it easier for Google’s bots to crawl. Although using a CMS is not a must, it is highly advisable. Blogspot and Drupal are a couple of other well-known CMSes.

#5 Have a killer UX

Good User Experience (UX) can lead you to great results. See that navigation from one page to another inside the site is easy, even for an average user. Also, see that the loading time of the website is minimal, not only on a PC but also on other devices. Learn more about UX and make the most of it.

#6 Get a good Domain Name

Domain name? you may ask. Yes. See that the domain name is relevant to the content you are planning to produce on the site. Having the main base word/keyword of the website in your domain name can turn out to be very helpful. Choose a classy name, not a funny or silly one; silly-named websites are generally not taken seriously.
Finally, I would say it’s just one factor; it’s not compulsory to make your domain name keyword-related to your niche, because branding is an alternative. If you do have one, that’s good: I still see a lot of niche sites ranking for their keywords, and search engines still recognize them even after many Panda and Penguin updates.

#7 Keyword Analysis

Look for keywords that have low competition and high search volume. This will help you write articles around those keywords and gain more users. Google Keyword Planner is a free SEO tool that will help you find such words. Instead of a single word, it is advisable to use key phrases. Also, make sure you know how to use a keyword in context. Most important of all, though, is producing good-quality content that your readers connect with.
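As a rough illustration of this kind of analysis, here is a minimal Python sketch that filters and sorts keyword ideas exported from Google Keyword Planner. The CSV filename and the column names ("Keyword", "Avg. Monthly Searches", "Competition") are assumptions; adjust them to match your actual export.

```python
import csv

def promising_keywords(path: str, max_competition: float = 0.3):
    """Return (keyword, volume, competition) rows with low competition."""
    rows = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            volume = int(row["Avg. Monthly Searches"].replace(",", ""))
            competition = float(row["Competition"])
            if competition <= max_competition:
                rows.append((row["Keyword"], volume, competition))
    # High search volume first: low-competition, high-demand phrases on top.
    return sorted(rows, key=lambda r: r[1], reverse=True)

for kw, vol, comp in promising_keywords("keyword_ideas.csv")[:20]:
    print(f"{kw}: {vol} searches/month, competition {comp}")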

#8 Link Building

See that you first build proper internal links among the pages of your website so that it gets easier for users to navigate. It is fine to use a keyword in a link, but make sure it points only to a relevant page on your website. This can help improve the rank of other pages on your site.
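For auditing that interlinking, a small script that lists every same-site link on a page gives you a starting point for checking anchors against their targets. A standard-library sketch with a placeholder URL:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def internal_links(page_url: str):
    html = urlopen(page_url).read().decode("utf-8", errors="ignore")
    parser = LinkCollector()
    parser.feed(html)
    site = urlparse(page_url).netloc
    # Resolve relative links and keep only those on the same host.
    resolved = (urljoin(page_url, href) for href in parser.links)
    return [u for u in resolved if urlparse(u).netloc == site]

print(internal_links("https://example.com"))
```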
Always use relevant, high-quality outbound links. This builds trust with Google that what you are referring to is important and useful, and it might eventually push your website to a higher position in the search results. Keep away from bad, low-quality links. Then again, there are a few basic mistakes that newbies tend to make.

Things that one should NOT do for SEO

#1 Stuffing content with keywords.

This is an old trick, and it is lethal if you do it now. Keep the keyword density low (commonly cited guidance is around 1-3% of your content) and no more. Also, include the keyword only where it naturally fits the sentence. Do not stuff keywords everywhere you like, because that will only pull you down and down and down.
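To keep yourself honest while writing, a tiny density check helps. A minimal sketch in plain Python, with a made-up sample text and whatever ceiling you have chosen:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Keyword occurrences (in words) as a share of all words, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == kw_words
    )
    return 100.0 * hits * n / max(len(words), 1)

sample = "SEO tips for 2014: these SEO tips cover content, links and more."
print(f"{keyword_density(sample, 'SEO tips'):.1f}%")
```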

#2 Stealing content.

This is the worst possible thing one can do. We write to share our views with our readers, not to do a Ctrl+C and Ctrl+V. Although most don’t do this, some get misguided into the trap.

#3 Building unnatural links.

You must’ve heard that the more links you have, the better. Someone might even have suggested getting links from automated bots. No! Do not blindly build links under the impression that they will push your page higher in the list. Exactly the opposite will happen.

#4 Writing large amounts of useless content.

Although content is king, that doesn’t mean you should write more and more useless, irrelevant content just to increase the number of blog posts. Remember, the lower the quality of the content, the lower the rank of the page. Write content that is useful and relevant to the topic.
Note that stealing content also includes "rewriting" articles, and Google is smart at identifying such material even when copyscape.com misses it! Read about Google Panda, which takes care of spam and useless content. By useful, I mean new and unique content.
These are a few important things to do, and to avoid, in 2014 to make the best of your SEO. I hope they help you with your website.
Here are a few more SEO tips that I covered in 2013 which are still valid and can help you. Although dated 2013, they cover useful things that have not changed. Give them a look.

Increasing Your Analytics Productivity With UI Improvements

We’re always working on making Analytics easier for you to use. Since launching the latest version of Google Analytics (v5), we’ve been collecting qualitative and quantitative feedback from our users in order to improve the experience. Below is a summary of the latest updates. Some you may already be using, but all will be available shortly if you’re not seeing them yet. 


Make your dashboards better with new widgets and layout options



Use map, device, and bar-chart widgets to create a perfectly tailored dashboard for your audience. Get creative with these and produce, share, and export custom dashboards that look exactly how you want, with the metrics that matter to you. We have also introduced improvements for customizing the layout of your dashboards to better suit individual needs. In addition, dashboards now support advanced segments!
Get to your most frequently used reports quicker

You’ll notice we’ve made the sidebar of Google Analytics even more user-friendly, including quick access to your all-important shortcuts:


If you’re not already creating Shortcuts, read more about them and get started today. We have also enabled shortcuts for real-time reports, which allows you to set up a specific region to see its traffic in real-time, for example.
Navigate to recently used reports and profiles quicker with Recent History


Ever browse around Analytics and want to go back to a previous report? Instead of digging for the report, use Recent History; we’ve made getting back even simpler.
Improving search functionality



Better Search allows you to search across all reports, shortcuts and dashboards all at once to find what you need.
Keyboard shortcuts

In case you've never seen them, Google Analytics does have some keyboard shortcuts. Be sure you’re using them to move around faster. Here are a few useful ones:

  • Search: s or / (opens the quick search list)
  • Account List: Shift + a (opens the quick account list)
  • Set date range: d then t (sets the date range to today)
  • On-screen guide: Shift + ? (shows the complete list of shortcuts)
Easier YoY Date Comparison


The new quick-selection option lets you select the previous year to prefill the date range, making year-over-year analysis faster.
Export to Excel & Google Docs 

Exporting keeps getting better, and now includes native Excel XLSX support and Google Docs:


We hope you find these improvements useful. As always, feel free to let us know how we can make Analytics even more usable, so you can get the information you need to take action faster.

Guidelines for Breadcrumb Usability And SEO

What is a Breadcrumb?
A breadcrumb, or breadcrumb trail, is a navigation aid used in user interfaces. It allows users to keep track of their location within programs or documents. The term comes from the trail of breadcrumbs left by Hansel and Gretel in the popular fairy tale.


Importance of Breadcrumbs For Usability

There are various pro-breadcrumb arguments by Nielsen and other usability experts such as the fact that breadcrumbs:

  • Help users visualize their current location in relation to the rest of the web site.
  • Enable one-click access to higher site levels, and thus help users who enter a site through search or deep links.
  • Take up very little space on the page.
  • Reduce the bounce rate. In fact, breadcrumbs can be an excellent way to entice first-time visitors to continue browsing through a web site after having viewed the landing page.
  • Enable users to reach pages and complete tasks faster.

Guidelines for Breadcrumb Usability And SEO
While several UI techniques exist for rendering breadcrumbs, there are generally agreed-upon usability and SEO guidelines for them (a small rendering sketch follows the list). Breadcrumbs should:


  • Never replace primary navigation. They have been devised as a secondary navigation aid and should always be used as such.
  • Not be used if all the pages are on the same level. Breadcrumbs are intended to show hierarchy.
  • Show hierarchy and not history. To go back, users use the browser’s back button. Replicating this facility defies the purpose of having breadcrumbs.
  • Be located in the top half of your web page. It can be placed above everything on the top of the page, just below the main navigation bar or just above the headline of the current page.
  • Not be too large. The breadcrumb trail is a secondary navigation aid and hence its size and prominence should be less than that of the primary navigation.
  • Progress from the highest level to the lowest, one step at a time.
  • Start with the homepage and end with the current page.
  • Have a simple link for each level (except for the current page). If the trail includes non-clickable elements such as page titles, include them but clearly differentiate which parts are clickable.
  • Have a simple, one-character separator between each level (usually “>”).
  • Not be cluttered with unnecessary text such as “You are here” or “Navigation”.
  • Include the full navigational path from the homepage to the current page. Not displaying certain levels will confuse users.
  • Include the full page title in the breadcrumb trail. Also ensure consistency between the page address and the breadcrumb. If the page titles include keywords, then this will make your breadcrumbs both human and search engine friendly.
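To make the list concrete, here is a minimal Python sketch that renders a trail following these guidelines: it starts at the homepage, ends with the current page left unlinked, and uses a simple ">" separator with no extra clutter. The page titles and URLs are illustrative.

```python
def breadcrumb_html(trail):
    """Render a breadcrumb trail of (title, url) pairs as an HTML string."""
    parts = []
    for title, url in trail[:-1]:
        parts.append(f'<a href="{url}">{title}</a>')
    # The current page is shown but deliberately not clickable.
    parts.append(trail[-1][0])
    # "&gt;" renders as the ">" separator in the browser.
    return " &gt; ".join(parts)

trail = [
    ("Home", "/"),
    ("Guides", "/guides/"),
    ("SEO", "/guides/seo/"),
    ("Breadcrumb Usability", "/guides/seo/breadcrumbs.html"),
]
print(breadcrumb_html(trail))
```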

Document Search Engines to Search for SEO Documentation

A while ago I listed a few ways to find PDF tutorials, and later also shared my collection of SEO cheat sheets, which showed that SEJ readers are definitely interested in SEO documentation. So today I am sharing a few document search engines to add to your arsenal.
Brupt is a Google custom search engine that searches for .doc, .pdf, Excel, and PowerPoint documents (.doc files by default). The search results interface looks like Google’s. You can choose a different file extension to search for right from the SERPs.
Voelspriet offers search for .doc, .pdf, Excel, PowerPoint, RTF, TXT, WRI, PS, and BAT files. Results open in a new tab as a Google advanced filetype: search.
DocJax is a more fun tool. It has search-as-you-type suggestions and is powered by Google and Yahoo. It also allows you to search all four file types (.doc, .pdf, Excel, and PowerPoint) simultaneously or separately, and gives a preview link. The site also has a community; if you are willing, you can join and save your favorite documents in your account area.

SEO & Website Redesign: Relaunching Without Losing Sleep

Redesigns can make an ugly site pretty, but they can also make a high traffic site invisible. Keep these tips and no-nos in mind and you can keep yourself out of the CEO’s office.
SEO Redesign: Teamwork First
It should go without saying, but SEOs, developers and designers must work together cohesively during the site redesign process.
Too often, companies look to refresh the look of their site and, in the end, destroy their search engine presence. How? This can stem from a myriad of causes, from coding errors and SEO-unfriendly design practices to even more disastrous ones (e.g., content duplication, URL rewriting without redirection, or information architecture changes away from search-engine-friendly techniques).
Starting the redesign process with a collaborative call between the SEO team, designer, developer, and company decision maker(s) is always the best first step.
Often there are two attitudes present. Either, “We are redesigning our site and are not open to your ideas…but don’t let us do anything wrong,” or the other attitude (and my favorite), “Let’s work together to achieve a refreshed look and functionality and instill any missing SEO opportunities if possible.” 
To satisfy both scenarios, your job as the SEO is to inform designers and developers of the mistakes to avoid, and to tell all parties what SEO revisions should be made to the site, along with what search engines have recently been paying attention to.
Page Load Time 
A site redesign gives you the opportunity to re-code, condense externally referenced files, and achieve faster load times.
Don’t let the designer use the word "Flash" during your call(s). In an attempt to make a new site look pretty, heavy reliance on multimedia can hurt site speed. Ignoring this is bad: Google has stated in the last year that site speed is a ranking consideration, and slower sites annoy users.
Content Duplication
Ensure that your development environment or beta sections of the site are excluded from search engines’ view. Relaunching your site after these elements have been indexed means your cool new site is a duplicate, and you will be in a mad dash trying to redirect the leaked development environment. Also, make sure there are no live copies on other servers that are visible to the search engines.
Another form of content duplication is the creation of new URLs without properly redirecting old URLs via a 301 permanent redirect. This will leave search engines wondering which page should be ranked.
It's also worth mentioning that 301s are a must and that 302 temporary redirects should not be used. Make it commonplace in the redesign process that no one uses the word "delete" in reference to site content. You should never delete any pages; they should be permanently redirected to the most relevant remaining page.
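Once the redirect map exists, it pays to verify it before and after launch. A hedged sketch (the URL pairs are placeholders, and requests is a third-party library, installed via pip):

```python
import requests  # pip install requests

# Placeholder mapping of old URLs to their new homes.
url_map = {
    "https://example.com/old-page": "https://example.com/new-page",
    "https://example.com/old-post": "https://example.com/blog/new-post",
}

for old, expected in url_map.items():
    # Don't follow redirects: we want to see the raw status and target.
    r = requests.get(old, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    ok = r.status_code == 301 and location == expected
    print(f"{old}: {r.status_code} -> {location or '(none)'} {'OK' if ok else 'CHECK'}")
```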
Content Restrictions 
Before you push the site to the web, it’s important to identify which pages shouldn’t be crawled.
Are there new parts of the site that shouldn’t be seen by search engines, login pages, etc.? Does the new site utilize dynamic URL creation or parameters that will need to be restricted?
Inversely, what pages might be restricted that shouldn’t be? Is there a folder in the robots.txt file that is inaccurately excluding pages that should be visible? Have meta robots tags been placed on pages that shouldn’t have been tagged?
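Both directions of that check can be automated with Python's standard-library robots.txt parser. A minimal sketch, with placeholder URLs:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

should_be_crawlable = ["https://example.com/", "https://example.com/products/"]
should_be_blocked = ["https://example.com/login", "https://example.com/beta/"]

# Pages you want indexed must be fetchable by any crawler...
for url in should_be_crawlable:
    print(url, "crawlable:", rp.can_fetch("*", url))
# ...and restricted pages must not be.
for url in should_be_blocked:
    print(url, "blocked:", not rp.can_fetch("*", url))
```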
Tracking
Make sure that your analytical tracking code is placed back in the page source before the site goes live. Additionally, any conversion pages should have the appropriate conversion tracking code appended. Nothing makes an SEO want to cry like lost data.
Information Architecture
A redesign is the perfect time to rethink the direction of the site. Go beyond the need for a refreshed look and analyze the hierarchy of your content. Google is looking at this, so be sure there is a clear view of the overall site theme, as well as sub-themes flowing into the site through an appropriate folder structure.
URL Rewrite
If you're redesigning and shaking a site down to its core, there's no better time than now. You have the attention and devotion of the site developer to make your URLs right.
This is a continuation of the Information Architecture revisions. Be mindful of folder structure as well as relevant, keyword-rich text usage in page names.
Want to go the extra mile? Have the filename extensions removed so that, down the road, if you redesign the site again with a different scripting language, you won’t have to do another URL rewrite.
Lastly, make sure all rewritten URLs include a 301 permanent redirect from the old URL to the new URL.
W3C/Section 508/Code Validation
Take advantage of this period to address code issues and how your site adheres to W3C and Section 508 compliance factors. Search engines want to see excellence here, and now is your chance to make their visits successful, as well as those of your human visitors.
Usability
Can you make the intended visit funnel shorter or easier? This is a great time to think about what you want visitors to do. You may be able to remove a step in the purchase/goal funnel and increase your site’s conversion rate.
Benchmarking
To truly assess the success of the redesign from an SEO and sales standpoint, record several site statistics beforehand and monitor them closely post-launch. You will be happy you did, because it will either make for a visible success story or be a lifesaver for finding problems once the site launches.
These include:
  • Run a ranking report.
  • Check your pages indexed in Google and Bing.
  • Run a page load time test.
  • Perform a W3C code validation report.
  • Note the bounce rate, time on site, pages per visit, and goal completions. Granted, this can be reviewed in analytics after launch, but be mindful that you should be watching it.
  • Run a site spider crawl of the live site to get a good list of URLs on the current site. You may need this to clean up any missed redirects.
  • Note the average time for Google to download a page and the average pages crawled per visit in Google Webmaster Tools. Also, "fetch as Googlebot" so you have a previous copy of what Google used to see.
Conclusion
Taking into account all of the mistakes you or the others on the redesign team shouldn’t be making will ultimately leave you much less stressed after the site launches. Meanwhile, minding all the opportunities that a redesign presents from an SEO and usability standpoint can lead to a successful launch and a fruitful post-launch environment.

Now get out there and show them how it’s done!

4 Ways to Find Out Why Your Website Traffic Died After a Relaunch

A site redesign and relaunch can be an exciting and busy time in the life of a company’s web marketing program. It's a great time to shake a site down to its core, revamp the message, look, and feel, and, most importantly, structure the site for SEO success (assuming you read my article on how to relaunch without losing sleep).
On the other hand, if done improperly, a relaunch or site update can have disastrous consequences. While many anxiously await increased traffic and conversions from the updated site, others are greeted with tanking traffic post-launch.
Frantically assessing the site to find out what's gone wrong and why can be the most nerve-wracking part of a post-launch failure. Below is a quick assessment to diagnose post-launch issues.

1. Check Google Analytics

Has all site traffic ceased? If so, maybe analytical tracking didn't make it to the new site. Check this manually.
If you are receiving organic traffic, just at a reduced rate, run the site through Analytics Checkup. It could be that a certain section of the site, such as the blog, is missing proper tracking code placement. Its scrape of all pages for tracking placement will identify issues.
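If you prefer to spot-check by hand, a few lines of Python do a crude version of the same scan. The URL list and the "UA-" property-ID marker are assumptions; substitute your own pages and snippet text.

```python
from urllib.request import urlopen

# Placeholder pages to audit for the analytics snippet.
pages = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/contact/",
]

for url in pages:
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    # Naive marker check: classic GA property IDs start with "UA-".
    if "UA-" not in html:
        print("Missing tracking code:", url)
```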

2. Check robots.txt

If analytics passes inspection, then you know something else is wrong. The first thing to consider is deindexation.
Check the robots.txt file for "Disallow: /", or the head of the page source code for a meta robots tag declaring noindex. If your site is typically crawled very frequently, this can do damage very quickly and start killing rankings. If your site doesn't enjoy frequent crawling, this culprit can take days to a week before killing your online presence.
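Here is a minimal sketch of those two checks; the crude string matching and placeholder domain are assumptions, so treat it as a starting point rather than a robust audit.

```python
from urllib.request import urlopen

def fetch(url: str) -> str:
    return urlopen(url).read().decode("utf-8", errors="ignore").lower()

# 1. A blanket Disallow in robots.txt blocks the whole site.
robots_lines = [line.strip() for line in
                fetch("https://example.com/robots.txt").splitlines()]
if "disallow: /" in robots_lines:
    print("robots.txt may be blocking the entire site")

# 2. A meta robots noindex tag on the page itself.
home = fetch("https://example.com/")
if 'name="robots"' in home and "noindex" in home:
    print("Possible meta robots noindex on the homepage")
```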

3. A Deeper Check of Google Analytics

OK, all the factors above are fine; where to next? It's time to review Google Analytics again, but go deeper.

Page Names Changed During the Relaunch

Was this URL rewrite architected well so that old URLs are 301 redirected to new pages? Review the organic traffic by landing page for those with the largest loss with a date range of the week prior to launch. Have those landing pages showing as top performers last week been redirected to new URLs?
(Note: You can also analyze Google Webmaster Tools for 404 error pages. However, it can take days for this information to appear, and we don’t have that much time.)
Next, move to the Content section and the sub-category of All Pages in Google Analytics. Choose the Primary Dimension of All Pages while also choosing a date range of post-launch.
Now, knowing the text rendered in the title element of your 404 page, filter for this text and see how many pageviews on the site are rendering 404 pages. Furthermore, open a secondary dimension of Landing Page to find these 404ing pages.
When you redirected pages, did you do a simple bulk redirect of pages to the homepage or a site section, or detailed one-to-one redirects? The latter is the preferable choice: bulk-redirected ranking listings may no longer have any thematic correlation with their respective search terms, and thus be washed away from ranking for those terms.

Page Names Didn't Change During the Relaunch

Once again, look at organic traffic by Landing Page. Look at post-launch vs. a comparable time pre-launch.
You still see the drop, but now open a secondary dimension by Keyword. Make an assessment of the keyword losses paired to their respective landing pages.
Review the current landing page vs. the pre-launch landing page. Have the suffering keywords in question disappeared from the focus/theme of the page?
Assuming you ran a keyword ranking report before launch, run one again and see if there are noticeable ranking drops already. Again, review pre-launch vs. post-launch pages, as above, for keyword theme differences.

4. Check for Host or Server Issues

Analytics are fine, there are no de-indexation issues, all redirects (if applicable) are fine, and all keyword focus per page is fine. What gives?
Did you change hosting or servers? Communication issues between visitors, the host, and the server can delay content delivery or cause it to time out entirely. This leaves a search engine with no way to view the page.
You can review this in Google Webmaster Tools under Crawl Errors and assess DNS errors and Server Connectivity. This too may take days to show, and time is something we don’t have.
Run the site through Pingdom’s DNS Health and the Ping/Traceroute tool. This will help identify potential content delivery and server communication issues that may exist.
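For a quick first read before those tools report anything, you can time DNS resolution and the page fetch yourself. A rough standard-library sketch, with a placeholder host:

```python
import socket
import time
from urllib.request import urlopen

host = "example.com"  # placeholder; use your own domain

start = time.time()
ip = socket.gethostbyname(host)            # DNS lookup only
dns_ms = (time.time() - start) * 1000

start = time.time()
body = urlopen(f"https://{host}/").read()  # full page fetch
fetch_ms = (time.time() - start) * 1000

print(f"DNS: {ip} in {dns_ms:.0f} ms; "
      f"fetched {len(body)} bytes in {fetch_ms:.0f} ms")
```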

Finding Resolution

While there may be alternative methods for finding post-launch issues, following the tips above should help you quickly run through your site to pinpoint your traffic's cause of death.
If everything above checked out OK for you and you still don’t know why you're experiencing a grave organic search exposure loss, then you may have a less common issue that requires a deeper dive. Perhaps there is a design/code flaw or flagrant over-optimization.


2013 Search Engine Ranking Factors

Correlations

To compute the correlations, we followed the same process as in 2011. We started with a large set of keywords from Google AdWords (14,000+ this year) that spanned a wide range of search volumes across all topic categories. Then, we collected the top 50 organic search results from Google-US in a depersonalized way. All SERPs were collected in early June, after the Penguin 2.0 update.
For each search result, we extracted all the factors we wanted to analyze and finally computed the mean Spearman correlation across the entire data set. Except for some of the details that I will discuss below, this is the same general process that both Searchmetrics and Netmark recently used in their excellent studies. Jerry Feng and Mike O'Leary on the Data Science team at Moz worked hard to extract many of these features (thank you!).
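As a toy illustration of that computation (not the study's actual code): correlate a factor's values against SERP positions for each keyword, then average across keywords. The factor values below are made up, and scipy is an assumed dependency.

```python
from scipy.stats import spearmanr  # pip install scipy

# factor_values[keyword] = the factor (e.g. Page Authority) for results 1..n
factor_values = {
    "kw1": [72, 65, 68, 50, 41, 44, 30, 33, 25, 20],
    "kw2": [55, 60, 48, 47, 40, 35, 36, 28, 22, 18],
}

correlations = []
for kw, values in factor_values.items():
    positions = range(1, len(values) + 1)
    rho, _ = spearmanr(values, positions)
    # Negate so "higher factor at better (lower) position" reads as positive.
    correlations.append(-rho)

print("mean Spearman:", sum(correlations) / len(correlations))
```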
When interpreting the correlation results, it is important to remember that correlation does not prove causation.
Rand has a nice blog post explaining the importance of this type of analysis and how to interpret these studies. As we review the results below, I will call out the places with a high correlation that may not indicate causation.

Enough of the boring methodology, I want the data!

Here's the first set, Mozscape link correlations:
Correlations: Page level
Correlations: Domain level
Page Authority is a machine learning model inside our Mozscape index that predicts ranking ability from links, and it is the highest-correlated factor in our study. As in 2011, metrics that capture the diversity of link sources (C-blocks, IPs, domains) also have high correlations. At the domain/sub-domain level, sub-domain correlations are larger than domain correlations.
In the survey, SEOs also thought links were very important:
Survey: Links

Anchor text

Over the past two years, we've seen Google crack down on over-optimized anchor text. Despite this, anchor text correlations for both partial and exact match were also quite large in our data set:
Interestingly, the surveyed SEOs thought that an organic anchor text distribution (a good mix of branded and non-branded) is more important than the number of links:
The anchor text correlations are one of the most significant differences between our results and the Searchmetrics study. We aren't sure exactly why this is the case, but suspect it is because we included navigational queries while Searchmetrics removed them from its data. Many navigational queries are branded, and will organically have a lot of anchor text matching branded search terms, so this may account for the difference.

On-page

Are keywords still important on-page?
We measured the relationship between the keyword and the document both with the TF-IDF score and the language model score and found that the title tag, the body of the HTML, the meta description and the H1 tags all had relatively high correlation:
Correlations: On-page
See my blog post on relevance vs. ranking for a deep dive into these numbers (but note that this earlier post uses an older version of the data, so the correlation numbers are slightly different).
SEOs also agreed that the keyword in the title and on the page were important factors:
Survey: On-page
We also computed some additional on-page correlations to check whether structured markup (schema.org or Google+ author/publisher) had any relationship to rankings. All of these correlations are close to zero, so we conclude that they are not used as ranking signals (yet!).

Exact/partial match domain

The ranking ability of exact and partial match domains (EMD/PMD) has been heavily debated by SEOs recently, and it appears Google is still adjusting their ranking ability (e.g. this recent post by Dr. Pete). In our data collected in early June (before the June 25 update), we found EMD correlations to be relatively high at 0.17 (0.20 if the EMD is also a dot-com), just about on par with the value from our 2011 study:
This was surprising, given the MozCast data that shows EMD percentage is decreasing, so we decided to dig in. Indeed, we do see that the EMD percent has decreased over the last year or so (blue line):
However, we see a see-saw pattern in the EMD correlations (red line) where they decreased last fall, then rose back again in the last few months. We attribute the decrease last fall to Google's EMD update (as announced by Matt Cutts). The increase in correlations between March and June says that the EMDs that are still present are ranking higher overall in the SERPs, even though they are less prevalent. Could this be Google removing lower quality EMDs?
Netmark recently calculated a correlation of 0.43 for EMD, and it was the highest overall correlation in their data set. This is a major difference from our value of 0.17. However, they used the rank-biserial correlation instead of the Spearman correlation for EMD, arguing that it is more appropriate to use for binary values (if they use the Spearman correlation they get 0.15 for the EMD correlation). They are right, the rank-biserial correlation is preferred over Spearman in this case. However, since the rank-biserial is just the Pearson correlation between the variables, we feel it's a bit of an apples-to-oranges comparison to present both Spearman and rank-biserial side by side. Instead, we use Spearman for all factors.

Social

As in 2011, social signals were some of our highest correlated factors, with Google+ edging out Facebook and Twitter:

SEOs, on the other hand, do not think that social signals are very important in the overall algorithm:
This is one of those places where the correlation may be explainable by other factors such as links, and there may not be direct causation.
Back in 2011, after we released our initial social results, I showed how Facebook correlations could be explained mostly by links. We expect Google to crawl their own Google+ content, and links on Google+ are followed so they pass link juice. Google also crawls and indexes the public pages on Facebook and Twitter.

Takeaways and the future of search

According to our survey respondents, here is how Google's overall algorithm breaks down:
We see:
  1. Links are still believed to be the most important part of the algorithm (approximately 40%).
  2. Keyword usage on the page is still fundamental, and other than links is thought to be the most important type of factor.
  3. SEOs do not think social factors are important in the 2013 algorithm (only 7%), in contrast to the high correlations.
Looking into the future, SEOs see a shift away from traditional ranking factors (anchor text, exact match domains, etc.) to deeper analysis of a site's perceived value to users, authorship, structured data, and social signals: