
Tuesday, April 8, 2014

Matt Cutts Explains How Google Tests Its Algorithms

Have you ever wondered how Google decides which algorithm is better than another when they're pushing out one of the many tweaks they do weekly? How do they judge which tweak really produces better results and which doesn't? Or does the spam team just hit a big red button on the server and hope for the best?

Google's Matt Cutts spills the beans on how the search team actually does it in a Webmaster Help video, which answers the question of what metrics Google uses to evaluate whether one iteration of the ranking algorithm is delivering better quality results to users than another.

While Cutts starts off by saying that he could geek out on this topic for quite some time, and I'm sure many of us would love to listen to him do just that, he says he will try to hold back for the sake of video length.


"Whenever an engineer is evaluating a new search quality change, and they want to know whether it's an development, one thing that's useful is we have hundreds of excellence raters who have previously rated URLs as good or bad, spam, all these sorts of dissimilar things.
"So when you make a change, you can see the flux, you can see what moves up and what moves down, and you can look at example searches where the results tainted a lot for example," he said. "And you can say OK, given the changed search results, take the URLs that moved up, were those URLs typically higher rated than the URLs that moved down by the search quality raters?"

While Google tries to keep the details of their quality rater guidelines secret, they inevitably end up getting leaked. The most recent version leaked in November and detailed precisely what quality raters are looking for when they rate search results.
"Sometimes since these are recomputed numbers, as far as the ratings, we've already got a stored data bank of all those ratings from all the raters that we have, at times you'll have question marks or empty areas where things haven't been rated," he said. 

"So you can also send that out to the raters, get the results either side-by-side, or you can look at the character URLs, and they say in a side-by-side this set of search results is better, or this set is better or they might say this URL is good in this URL is spam, and you use all of that to assess whether you're making good development."

While it is good that Google pushes these kinds of changes to the quality raters to see what they notice, the process doesn't always catch everything. There have definitely been times when new tweaks broke something the quality raters didn't catch, such as what we saw with the entertainment sites that significantly declined in the rankings in February.

"If you make it further along, and you're getting close to trying to launch something, often you'll launch what is called a live experiment where you really take two different algorithms, say the old algorithm and the new algorithm, and you take results that would be generated by one and then the other and then you might interleave them. And then if there are more clicks on the newer set of search results, then you tend to say you know what, this new set of search results generated by this algorithm power be a little bit better than this other algorithm.

It is interesting how he describes interleaving the two sets of search results, as normally we hear about either full pushes or pushes to a small percentage of users. However, this could be a live experiment limited strictly to Google employees and quality raters.
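For a rough idea of how interleaving works in general (this is a simplified, generic sketch, not Google's implementation; the results and clicks are made up), the two rankings are merged by alternating picks, and each click is credited back to the algorithm that contributed that result:

```python
# Simplified interleaving sketch; results, clicks, and the crediting rule are
# made up for illustration and are not Google's implementation.
results_old = ["a1", "a2", "a3"]   # ranking from the old algorithm
results_new = ["b1", "a1", "b2"]   # ranking from the new algorithm

interleaved, seen, source = [], set(), {}
for rank in range(max(len(results_old), len(results_new))):
    for label, results in (("old", results_old), ("new", results_new)):
        if rank < len(results) and results[rank] not in seen:
            interleaved.append(results[rank])
            seen.add(results[rank])
            source[results[rank]] = label  # credit the algorithm that contributed it first

clicks = ["b1", "b2"]  # hypothetical user clicks on the interleaved list
credit = {"old": 0, "new": 0}
for url in clicks:
    credit[source[url]] += 1

print(interleaved)  # ['a1', 'b1', 'a2', 'a3', 'b2']
print(credit)       # {'old': 0, 'new': 2}: more credited clicks suggests the new algorithm is better
```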

He does say that within Google, the web spam team's metrics can look quite different from the rest of Google's, simply because they like to click on spam to see what's ranking and why it is ranking, and to better figure out how to get rid of it.
 
"Sometimes our metrics look a little bit worse in web spam because people click on the spam, and we're like we got less spam, and it looks like people don't like the algorithm as much," he said. "So you have to take all those ratings with little bit of a grain of salt, because nothing replaces your judgment, and the judgment of the quality launch committee."

The quality launch committee is actually not that well known, but it is simply a group of search quality engineers that receives reports and holds meetings regarding search quality, something Matt has mentioned at least once in previous Webmaster Help videos.
He continues by talking a little bit about what exactly the quality raters are looking for when they're doing their ratings.

"People can rate things as relevant on a scale, they can rate things as spam, they can even rate the quality of the page, which it sort of does it matter based on the query, but how reputable the page is itself," Cutts said. "And then we have metrics that blend all of that together and when we're done we say okay in general we think that the results got a little bit better, and here the kinds of ways they got better or worse. 

"We can even slice and dice and look at different countries or different languages, all that sort of stuff. So in web spam we're not that surprised if users continue to click on the spam, because we can recognize the spam; we have expert raters on those kinds of topics. And we pay special attention to specific countries where we know there's more spam, so we can see the sorts of reactions we get there."
He continues and talks a bit about how, every so often, they go through and update the quality rating guidelines, something we've seen updated multiple times over the years.

"So we've got it down to a pretty good system," Cutts said. "Every so often we have to revive a process and look at how to improve it, but for the most part things up relatively well in terms of assessing what are the big changes, once you see those big changes it gives you ideas to go back and improve and make things better, and by the time we get to launch committee, normally everybody has a pretty good idea about whether it works and what the strengths and weaknesses of a particular algorithm are.

So if you had visions of Matt Cutts sitting in his office with a big red button on his desk to unleash some new algorithm without any feedback or oversight, you will probably be disappointed. There is actually a lot that goes into testing algorithms, particularly the large ones, and they do get put through the wringer before they go live, to ensure that Google is serving up better search results than the previous algorithm did.


You might also like: Google Penalized Link Networks 

Wednesday, March 19, 2014

Influence of the Google Panda Update on Small Business Rankings

Google is working on an update to Panda, a "kinder, softer Panda," Google's Distinguished Engineer Matt Cutts announced at SMX last week. The aim of the next generation of Panda is to help small businesses that may have been impacted by earlier versions of the algorithm, which Google has unleashed periodically since the initial launch in February 2011.

Panda was first launched as Google's algorithmic answer to low quality and thin content sites that had gained prominent rankings in the search results.

While Panda was initially aimed at battling content farms, the side effects were huge for less authoritative sites, a category that many small websites fall into, regardless of whether they had stellar or poor quality content.


The Panda algorithm was especially hard on small businesses and greatly decreased their search visibility compared to larger or "big brand" types of sites that Google seems to favor. This is particularly true for product related searches, where sites like Amazon or large retailers dominate the results and smaller sites just can't compete, even if they offer better service or prices.

The same applies to websites offering local services, such as a local real estate agent or a local exterminator. While local search can solve some of these problems, there remains the issue that their content doesn't rank well for regular searches.

On a WebmasterWorld thread on the topic, user EditorialGuy gave an excellent description of how Panda affects small businesses, using a hypothetical example search:

"Take a query like 'armadillo grooming tools.' In the current Google results, such a query might yield product listings from Amazon, Target, Walmart, Petsmart, Petco, and so on. If I were a Google search engineer testing that query, I'd want to see a mixture of name-brand results and results from specialist sites like armadillofancy dot com that present unique content and show a real understanding of and passion for armadillos."

Last year, Cutts seemed to acknowledge that Panda was having a pretty significant impact on smaller businesses. In August he began asking for examples of small quality sites that weren't ranking well in Google. At the time, many small sites that had been negatively impacted by Panda were hopeful that an upcoming Panda update would give them a bit of a boost in the rankings.

There's no word on when webmasters might see the Panda refresh, but since Cutts mentioned it, we can probably expect to see it live sometime within the next few months.


 you might also like:  How Algorithm Updates influence SEO

Thursday, March 6, 2014

Google’s Tips for Identifying and Fixing Hacked Sites

A post went up on Google’s official Webmaster Central Blog last night from a representative of the Search Quality Team providing guidelines for how to find out if your site has been hacked, as well as how to fix it and prevent future incidents.

Since hacking is astonishingly common, I felt it was important to pass along this information to SEJ readers. Please take a minute or two to review these tips, even if you don’t think you may be a victim of hacking. No one ever expects their site to get hacked, so it’s good to be prepared in the unfortunate event that it does occur.

Adding spammy pages is the most common way hackers take advantage of vulnerable sites, Google says. Hackers add spammy pages to redirect users to undesired or harmful destinations. For example, Google says they have seen a rise in hacked sites redirecting visitors to online shopping sites.

Here are some tips Google provides to help you identify hacked content on your site:

•    Check for shady looking URLs or directories: You can check for any kind of shady activity on your site by performing a “site:” search of your site in Google. If there are any suspicious URLs or directories that you do not recognize, they may have been added by a hacker. (A quick automated spot check is sketched after these tips.)

•    Check the Search Queries page in Webmaster Tools for unnatural looking queries: The Search Queries page shows Google Web Search queries that have returned URLs from your site. Look for unexpected queries, as they can be an indication of hacked content on your site.

•    Turn on email forwarding in Webmaster Tools: Google will send you a message if they detect that your site may be compromised. Messages appear in Webmaster Tools’ Message Center but it’s a best practice to also forward these messages to your email.
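Beyond those manual checks, one quick automated spot check (my own addition, not something from Google's post) is to compare what a page serves to a regular browser user agent versus Googlebot's user agent, since hacked pages often cloak spam so that only search crawlers see it. A minimal sketch, with a placeholder URL:

```python
# Rough spot check, not from Google's post: hacked pages often cloak spammy
# content so that only search engine crawlers see it. Fetch the same URL with a
# browser user agent and with Googlebot's user agent and compare the responses.
import urllib.request

URL = "https://www.example.com/"  # placeholder: replace with a page on your own site

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

browser_body = fetch(URL, "Mozilla/5.0")
crawler_body = fetch(URL, "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

if browser_body != crawler_body:
    print("Responses differ by user agent; inspect this page for cloaked spam.")
else:
    print("No user-agent-based difference on this URL.")
```

Keep in mind that dynamic pages can differ between any two fetches for harmless reasons, so treat a mismatch as a prompt to look closer rather than proof of a hack.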
Here are some tips Google provides for how to fix and prevent hacking:

•    Stay informed: The Security Issues section in Webmaster Tools will show you hacked pages detected on your site. Google also provides detailed information to help you fix your hacked site.

•    Protect your site from potential attacks: Prevent attacks by keeping the software that runs your website up to date, signing up for the latest security updates for your website management software, and choosing a provider that you can trust to maintain the security of your site.

Google also reminds webmasters that they can help keep the web safe by reporting sites they believe may have been hacked. If you find suspicious sites in Google search results, you can report them using the Spam Report tool.

 you might also like: Google Analytics-URL target Goals 


Thursday, October 24, 2013

The Impact of Penguin 2.1


On October 4th, Matt Cutts announced the release of Penguin 2.1, the latest in a series of Penguin updates.

This post is intended to give you a peek behind the curtain, into the world of Penguin. I will focus on three different websites, with three different outcomes.

1. A Penguin Recovery During the 2.1 Update

The company was initially hit by a previous Penguin update, but was late in tackling its link issues, as it was also working on technical problems and content issues.
During late spring and summer, unnatural links were removed as much as possible, while links that could not be manually removed were disavowed. By the way, that’s the approach I recommend. I’m not a big fan of disavowing all bad links, and I never have been.

Based on links downloaded from Google Webmaster Tools, Majestic SEO, and Open Site Explorer, the company tackled its unnatural link situation as best it could. Now they just needed another algorithm update to see if their hard work paid off.
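As a minimal sketch of the bookkeeping side of that cleanup (the export file names and the flagged domains below are placeholders, and deciding which links are actually unnatural remains a manual judgment call), you can merge the exports, reduce them to referring domains, and write the ones you could not get removed into a disavow file using the domain: syntax the disavow tool accepts:

```python
# Sketch of the cleanup bookkeeping described above. File names, column layout,
# and the flagged domains are placeholders; judging which links are unnatural
# is still a manual job.
import csv
from urllib.parse import urlparse

EXPORTS = ["gwt_links.csv", "majestic_links.csv", "ose_links.csv"]  # hypothetical export files
FLAGGED = {"spammy-directory.example", "paid-links.example"}        # domains you judged unnatural

referring_domains = set()
for path in EXPORTS:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row and row[0].startswith("http"):  # assumes the first column holds the linking URL
                referring_domains.add(urlparse(row[0]).netloc.lower())

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Unnatural domains that could not be removed manually\n")
    for domain in sorted(referring_domains & FLAGGED):
        out.write(f"domain:{domain}\n")
```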

I advise any company hit by an algorithm update to keep driving ahead as if it weren’t hit. Keep producing great content, keep leveraging social to get the word out, keep building natural links, etc.

Key Takeaways:

•    Move quickly and keep a strong focus on what you need to tackle link-wise. Even though this company recovered, it delayed its Penguin work for some time.

•    Be thorough. Don’t miss links you need to nuke. Penguin is algorithmic and there is a threshold you need to pass.

•    Remove as many unnatural links as you can manually, and then disavow the rest.

2. A Penguin 2.0 and 2.1 One-Two Punch

You wouldn’t think a Penguin could pack a one-two punch, but it has in several situations I’ve analyzed recently, and worse, this was after the companies involved thought they had addressed their unnatural link problems thoroughly.

After getting pummeled by Penguin 2.0 on May 22nd, the business gathered its troops, thought it had identified all of its unnatural links, and worked hard on removing them. After what seemed to be a thorough attack, they eagerly awaited another Penguin update. When Penguin 2.1 was announced by Matt Cutts, they watched their reporting with intense focus, only to be thoroughly disappointed with the outcome. They got hit even worse.


The Second Penguin Hit

Quickly reviewing the site’s link profile revealed a common problem: companies put a stake in the ground and remove as many unnatural links as they can at a given point in time, but they don’t continue analyzing their links to see if more unnatural links pop up, and that’s a dangerous mistake.

I saw many unnatural links in their profile that were first found during the summer and fall of 2013. Many showed up after their Penguin work had been completed. Those links are what got them hit by Penguin 2.1.

Fresh Unnatural Links Caused the Penguin 2.1 Hit:

Don’t think you are done with your link removals just because you have a spreadsheet from a few months ago. You need to repeatedly review your link profile to identify potential problems. If this company had done that, they would have picked up the many additional unnatural links showing up this summer and fall, and dealt with them accordingly. I believe that if they had, they could have avoided the nasty one-two punch of Penguin.
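One simple way to keep that ongoing review honest (my own sketch; the file names are placeholders and each file is assumed to hold one linking URL per line) is to diff the latest link export against the snapshot you audited last time and look only at the referring domains that are new:

```python
# Sketch: surface referring domains that have appeared since the last audit.
# File names are placeholders; each file is assumed to list one linking URL per line.
from urllib.parse import urlparse

def referring_domains(path):
    with open(path, encoding="utf-8") as f:
        return {urlparse(line.strip()).netloc.lower() for line in f if line.strip()}

previous = referring_domains("links_last_audit.txt")
current = referring_domains("links_today.txt")

for domain in sorted(current - previous):
    print("New referring domain since the last audit:", domain)
```

Anything that shows up on that list deserves the same scrutiny the original link profile got.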

Key Takeaways:

•    Your Penguin work is ongoing. Don’t drop the ball.
•    Have your SEO continually monitor your link profile for unnatural links (whether that’s an internal SEO, agency, or consultant).
•    The one-two punch of Penguin is killer (and can be backbreaking).

Unnatural links have an uncanny way of replicating across low-quality sites and networks. I have clearly seen this during my Penguin analyses. Beware.

Summary: Penguin 2.1 Bringeth and Taketh Away

If you’ve been impacted by Penguin 2.1, you want to download and analyze your inbound links, flag unnatural links, take out as many as you can manually, and then disavow what you can’t remove. As I mentioned in the second case above, don’t stop analyzing your links once the initial phase has been completed. Continually monitor your link profile to make sure additional unnatural links don’t appear. Remember, another Penguin update might be right around the corner.

Monday, June 17, 2013

How to Get Ready for Google’s Summer Algorithm Plans

On Monday, May 13, 2013, Matt Cutts released a very interesting video to the Google Webmaster Help YouTube channel. The video, titled What should we expect in the next few months in terms of SEO for Google?, covered a lot of ground, including upcoming changes to Penguin, Panda, SERP clustering, hacked sites, and Author Authority.

Take a look at the video for yourself below:

With so many areas being affected, let’s look at the best ways for you to protect against the upcoming changes.



Audit Your Links

The original Google Penguin update hammered sites that had aggressively built links in the past. Most of those websites have not yet recovered, and some of them never will. By labeling the next major update as Penguin 2.0, Matt is purposely refusing to split hairs. This has the look of a significant upheaval for anyone still trying to game the system with any link they can get. He alludes to going “a little deeper” and it “having a little more impact.” Don’t let the word “little” fool you – this is “a little” deeper than something that caused ripples across the internet.

Paid link schemes are clearly a target for the next round. Google showed a modicum of restraint in the first wave by only penalizing obvious schemes such as BuildMyRank (recently renamed as HP Backlinks).

Now, they have had over a year to root out anyone else playing the same game. Common sense suggests that they will set their sights on a much wider range of paid link services this go around.

The smart webmaster is pausing any paid link acquisition efforts and doing a detailed audit of their existing link inventory.

Will this slow momentum? Yes, temporarily.

But will it help avoid further headaches? Probably, but it’s best to be safe.

Most of all, hold your SEO provider’s feet to the fire. If you don’t already know how they are acquiring links on your behalf, have them share and defend their link acquisition strategy. Be sure you are both comfortable that this strategy is in line with white hat SEO practices and Google’s quality guidelines. As Matt says in the video, the best way to avoid a penalty is to play by the rules.

Get Off Spammy Directories and Blog Networks

Although it was not overtly stated, we can be pretty confident that Google will penalize or deindex more blog networks and low quality directories. This is not a tidal wave you want to be caught up in. Keep in mind that this piece of the original update was a manual penalty. I don’t know how long they had been working on it before it went live, but we do know that they have now had at least 13 months to identify additional sites to penalize.

When auditing your links, pay special attention to directories and blog networks. Start the process of removing links from low authority, spammy sites that have no vetting process at the time of submission.

If you find one that charges a fee for being listed but has no approval process, move it to the top of the list for removal requests, and later disavow it if needed. This is exactly the profile that Google went after last April.

Change Your Link Building Practices

With Penguin 2.0 looking like a very big deal, force the issue on link building practices today. As Matt advises, follow the rules and you should be fine. We have seen many columns about Link Earning rather than Link Building. This approach takes much more effort and time, but it also insulates you against surprises such as a major slap from Google.

Once your domain is penalized, it is likely too late. The best course of action is to stay away from it altogether.

Embrace Quality Content Marketing

The backbone of Link Earning is a hot topic today: Content Marketing.

Yes, you need site authority. But that is nowhere near the end of the story. Google is extending authority to authors and even publishers, both of which go beyond the website to the entity.

In preparing to kick off a content campaign, be sure to have the necessary markup in place to establish yourself as an author or publisher.
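At the time, that markup typically meant rel="author" and rel="publisher" links pointing at a Google+ profile or page. As a rough self-check (a sketch with a placeholder URL, not an official validator), you can scan a page for those attributes:

```python
# Rough self-check, not an official validator: scan a page for rel="author" and
# rel="publisher" markup. The URL is a placeholder.
import urllib.request
from html.parser import HTMLParser

URL = "https://www.example.com/some-article"  # placeholder: use one of your own pages

class RelFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        rel_values = set((attrs.get("rel") or "").lower().split())
        if tag in ("a", "link") and rel_values & {"author", "publisher"}:
            self.found.append((tag, sorted(rel_values), attrs.get("href")))

with urllib.request.urlopen(URL) as resp:
    page_html = resp.read().decode("utf-8", errors="replace")

finder = RelFinder()
finder.feed(page_html)
print(finder.found or "No rel=author / rel=publisher markup found on this page.")
```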

When starting to produce original content, hold yourself to the highest standards. Look at what keywords might drive good traffic. Take time to brainstorm high interest and engaging topics.

Write with passion and tell a story. Make us want to read your work!

If you replace old school link building with content marketing, there are multiple benefits.

Avoidance of penalties and surprises.
Increase in the number of pages indexed with the search engines.
Rapidly expanded keyword coverage, which will increase the number of search queries for which you earn impressions.
You establish yourself and your brand as experts in your chosen topic or field.

Clean Up Any Existing Problems

Just to be safe, be sure you are aware of any problems on your site that need to be addressed. Google never outlines every single feature of their algorithm updates and penalties in advance. The safe way to go is to get everything in order before the updates hit.

Review every segment of your Google Webmaster Tools account. If there are issues, take care of them now just to be safe.

How many pages are indexed?
Is your XML sitemap working right?
How many crawl errors are the spiders logging?
What keywords are you ranking for now?
Who links to you? Are they relevant and reputable?
What content is receiving these links?
Is your page load time and overall site performance acceptable?
Is there a list of HTML improvements that you should be fixing?

These are problems we should all be addressing anyway. It never hurts to have the technical SEO aspects all in order. This shows the search engines that you take your site, its code, and the overall performance seriously.
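A couple of the items on that checklist, such as whether the sitemap's URLs actually resolve and how quickly pages respond, are easy to spot check with a short script. A minimal sketch, with a placeholder sitemap URL:

```python
# Sketch: fetch a sitemap, then report the status code and response time of a
# sample of the listed URLs. The sitemap URL is a placeholder.
import time
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: use your own sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")

for url in urls[:10]:  # sample the first few to keep the check quick
    start = time.time()
    try:
        with urllib.request.urlopen(url) as page:
            status = page.status
    except Exception as error:  # crawl problems show up here
        status = error
    print(f"{url}  status={status}  seconds={time.time() - start:.2f}")
```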


Tuesday, June 4, 2013

Matt Cutts: Google Didn't Make Panda & Penguin to Force People to Buy Ads

Are algorithmic updates created exclusively to force webmasters to buy ads and increase Google's bottom line?

It's no conspiracy that Google wants to make profits, but Google's Distinguished Engineer Matt Cutts has come out swinging against an often repeated "conspiracy theory," arguing that all Google updates are designed only to improve the user experience.

In a new video, Cutts also addressed the difference between a data refresh and an algorithm update, and where he believes SEO professionals are spending too much time and energy.




Algorithm Update vs. Data Refresh

Many webmasters confuse what is an algorithm update with what is merely a data refresh.

"When you’re changing your algorithm, the signals that you’re using and how you weight those signals are fundamentally changing," he said. "When you are doing just a data refresh then the way that you run the computer program stays the same, but you might have dissimilar incoming data, you might refresh the data that the algorithm is using. That’s something that a lot of people just don’t seem to necessarily get."

Cutts has previously explained the difference between updates and data refreshes on his blog.
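A toy way to picture the distinction (purely illustrative; the signals and weights below have nothing to do with Google's real ranking factors): treat the signal weights as the algorithm and the documents being scored as the data. Rerunning the same weights over fresher documents is a data refresh; changing the weights themselves is an algorithm update.

```python
# Purely illustrative: made-up signals, weights, and documents.
def rank(documents, weights):
    def score(doc):
        return sum(weights[signal] * doc[signal] for signal in weights)
    return [doc["url"] for doc in sorted(documents, key=score, reverse=True)]

weights_v1 = {"relevance": 1.0, "links": 0.5}                   # "the algorithm"
docs_march = [{"url": "x.com", "relevance": 3, "links": 14},
              {"url": "y.com", "relevance": 8, "links": 2}]     # "the data"

print(rank(docs_march, weights_v1))   # ['x.com', 'y.com']

# Data refresh: the same weights, run over fresher data.
docs_april = [{"url": "x.com", "relevance": 3, "links": 4},
              {"url": "y.com", "relevance": 8, "links": 2}]
print(rank(docs_april, weights_v1))   # ['y.com', 'x.com']

# Algorithm update: the weighting itself changes.
weights_v2 = {"relevance": 1.0, "links": 0.1}
print(rank(docs_march, weights_v2))   # ['y.com', 'x.com']
```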

Google Conspiracy Theory: More Updates = More Revenue

Cutts also tackled the persistent rumor that the reason Google does updates like Panda and Penguin isn’t to reduce spam but is to actually increase revenue. But Matt points out that if you look at Google’s quarterly statements, Panda in fact caused revenue to drop.



I have seen a lot of accusations after Panda and Penguin that Google is just trying to boost its revenue, and let me just tackle that head on. Panda, if you go back and look at Google’s quarterly statements, they actually talk about that Panda decreased our revenue. So a lot of people have this conspiracy theory that Google is making these changes to make more money. And not only do we not think that way in the search quality team, we’re more than happy to make changes which are better for the long term loyalty of our users, the user experience, and all that sort of stuff. And if that’s a short term revenue hit, then that might be okay, right, because people are going to be coming back to Google long term.

So for a lot of people, it’s a common conspiracy theory… Google did this ranking change because they want people to buy more ads, and that is certainly not the case with Panda, it’s certainly not the case with Penguin, and so it’s kind of funny to see that as a meme within the industry, and it’s just a misconception that I wanted to debunk. Panda and Penguin, we just went ahead and made those changes, and we aren’t going to worry if we lose money or make money or whatever; we just want to return the best results we can for users.

Pay Attention to Marketing & Make Something Compelling

Next, he tackled what he thought was where SEOs are spending too much time. He thinks people are spending too much time on links and maybe not enough time on social media. He also thinks people are missing out on the user experience they could be working on instead.

A lot of people think about “How do I build more links?” and they don’t think about the grander, global picture of “How do I make something compelling, and then how do I make sure that I market it well?” You know, you get too focused on search engines, and then you, for example, would entirely miss social media and social media marketing. And that’s a huge way to get out in front of people.

So specifically I would think, just like Google does, about the user experience of your site. What makes it compelling? What makes it interesting? What makes it fun? Because if you look at the history of sites that have done really well or businesses that are doing well now, you can take anywhere from Instagram to Path, even Twitter, there’s a cool app called YardSale, and what those guys try to do is they make design a fundamental piece of why their site is worth going to. It’s a great experience. People enjoy that.

So you might not just pay attention to design; you could pay attention to speed or other parts of the user experience. But if you really get that sweet spot of something compelling, where the design is really good or the user experience just flows, you’d be amazed how much growth and traffic and traction you can get as a result.

He also brings up that webmasters should continue to improve, because if you do not evolve, others will come along, think about how they could do it better, and then jump in and surprise you.