Friday, May 30, 2014

4 Mistakes Agencies Make to Lose Clients

Not understanding the client's business or economics, not doing enough keyword and competitive analysis, using a generic proposal template, and lack of communication and reporting the wrong metrics will cost you future business and current clients.

Is comScore's Google Search Share Data False? Not Exactly

Search market share data from comScore isn't invalid or incorrect. But new research from Conductor is also pretty consistent with what many webmasters find when it comes to actual click-throughs to a site from Google search results.

Thursday, May 29, 2014

Fetch as Google Adds Page Rendering

Google has added to the Fetch as Google tool in Webmaster Tools, allowing webmasters to render their webpage as Googlebot sees it and address any issues. You can Fetch and Render for Desktop, Mobile: Smartphone, Mobile: XHTML/WML or Mobile: cHTML.

Two Links, Different Anchor Text, Same URL: Does it Matter to Google?

In his latest Webmaster Help video, Google's Distinguished Engineer Matt Cutts discusses what happens to the flow of PageRank when Google crawls a page and finds two links on a page pointing to the same target, but each uses unique anchor text.

Want to Increase Your CTR by Nearly 50%? Consider Search Refinements

It's far more common for searchers to enter multiple phrases in rapid succession, building on each preceding one, until they find what they seek. Users are indeed more likely to click on advertisers who appear more often in the same session.

Here's Why Your Editorial Calendar Isn't Working...

They said that content was king and that editorial calendars would be our pathway to the kingdom. Maybe all of that is true. But if your editorial calendar isn't producing the results you had hoped for, here's why and what you can do about it.

What Search Strategists Overlooked in the New York Times Leaked Innovation Report

The authors of the Innovation Report were ruthlessly introspective about their search visibility, ranking, and optimization. Here are some highlights, including a number of obstacles the team has faced in terms of search marketing over the years.

Wednesday, May 28, 2014

Please Remove My Link. Or Else.




Getting links removed is a tedious business.


It’s just as tedious for the site owner who must remove the links. Google’s annoying practice of "suggesting" webmasters jump through hoops in order to physically remove links that the webmaster suspects are bad, rather than Google simply ignoring the links that they’ve internally flagged, is causing frustration.


Is it punitive? If so, it's doing nothing to endear Google to webmasters. Is it a smokescreen, i.e. Google doesn't know which links are bad, but having webmasters declare them helps Google build a more comprehensive database? A bit of both? It might also be a way of adding cost to SEO, putting SEO out of reach of small companies. Perhaps it's a red herring to make people think links are more important than they actually are.


Hard to be sure.


Collateral Damage


SEOs are accustomed to search engines being coy, punitive and oblique. SEOs accept it as part of the game. However, it becomes rather interesting when webmasters who are not connected to SEO get caught up in the collateral damage:



I received an interesting email the other day from a company we linked to from one of our websites. In short, the email was a request to remove links from our site to their site. We linked to this company on our own accord, with no prior solicitation, because we felt it would be useful to our site visitors, which is generally why people link to things on the Internet.



And check out the subsequent discussion on Hacker News. Matt Cutts' first post is somewhat disingenuous:



Situation #1 is by far the most common. If a site gets dinged for linkspam and works to clean up their links, a lot of them send out a bunch of link removal requests on their own prerogative



Webmasters who receive the notification are encouraged by Google to clean up their backlinks, because if they don’t, then their rankings suffer.


But, essentially from our point of view when it comes to unnatural links to your website we want to see that you’ve taken significant steps to actually remove it from the web but if there are some links that you can’t remove yourself or there are some that require payment to be removed then having those in the disavow file is fine as well.

(Emphasis mine)
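For reference, the disavow file Google accepts is a plain-text list, one URL or `domain:` entry per line, with `#` marking comments. A minimal example (the domains below are placeholders, not real sites):

```
# Contacted the owner of spamdomain1.example on 6/1/2014 to ask for removal; no reply.
domain:spamdomain1.example
http://spamdomain2.example/page-with-paid-link.html
```

The comment lines are where webmasters typically document the removal attempts Google says it wants to see.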


So, of course webmasters who have received a notification from Google are going to contact websites to get links removed. Google have stated they want to see that the webmaster has gone to considerable effort to remove them, rather than simply use the disavow tool.


The inevitable result is that a webmaster who links to anyone who has received a bad-links notification may receive the latest form of email spam, known as the “please remove my link” email. For some webmasters, this email has become more common than the “someone has left you millions in a Nigerian bank account” gambit, and is just as persistent and annoying.


From The Webmaster's Perspective


Webmasters could justifiably add the phrase “please remove my link” and the word "disavow" to their spam filters.
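A webmaster automating that filter could get by with a simple phrase match. This is a hypothetical sketch, not any mail client's actual API; the phrase list and function name are my own:

```python
import re

# Assumed phrase list for link-removal spam; tune to taste.
REMOVAL_PHRASES = [
    r"please remove (my|our|the) link",
    r"\bdisavow\b",
    r"link removal request",
]

def looks_like_removal_request(subject: str, body: str) -> bool:
    """Return True if the message matches any link-removal phrase."""
    text = f"{subject}\n{body}".lower()
    return any(re.search(phrase, text) for phrase in REMOVAL_PHRASES)
```

In practice this would sit behind a mail rule (procmail, Sieve, or a webmail filter) rather than run standalone.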


Let’s assume this webmaster isn’t in a bad neighbourhood and is simply caught in the cross-fire. The SEO assumes, perhaps incorrectly, that the link is bad and requests a take-down. From the webmaster's perspective, they incur a time cost dealing with link removal requests. A lone request might take a few minutes to action - but hang on a minute - how does the webmaster know the request is coming from the site owner and not from some dishonest competitor? Ownership takes time to verify. And why would the webmaster want to take down the link, anyway? Presumably, they put it up because they deemed it useful to their audience. Or perhaps some bot put the link there - as a forum or blog comment link - against the webmaster's wishes, and now, to add insult to injury, the SEO wants the webmaster to spend his time taking it down!


Even so, this might be okay if it’s only one link. It doesn't take long to remove. But, for webmasters who own large sites, it quickly becomes a chore. For large sites with thousands of outbound links built up over years, removal requests can pile up. That’s when the spam filter kicks in.


Then come the veiled threats. “Thanks for linking to us. This is no reflection on you, but if you don’t remove my link I’ll be forced to disavow you and your site will look bad in Google. I don’t want to do this, but I may have to.”


What a guy.


How does the webmaster know the SEO won’t do that anyway? Isn’t that exactly what some SEO conference speakers have been telling other SEOs to do regardless of whether the webmaster takes the link down or not?


So, for a webmaster caught in the cross-fire, there’s not much incentive to remove links, especially if s/he's read Matt's suggestion:



higherpurpose, nowhere in the original article did it say that Google said the link was bad. This was a request from a random site (we don't know which one, since the post dropped that detail), and the op can certainly ignore the link removal request.



In some cases Google does specify links:



We’ve reviewed the links to your site and we still believe that some of them are outside our quality guidelines.


Sample URLs:

http://ift.tt/1on5b8R


Please correct or remove all inorganic links, not limited to the samples provided above. This may involve contacting webmasters of the sites with the inorganic links on them.



And they make errors when they specify those links. They've flagged DMOZ & other similar links: "Every time I investigate these “unnatural link” claims, I find a comment by a longtime member of MetaFilter in good standing trying to help someone out, usually trying to identify something on Ask MetaFilter."


Changing Behaviour


Then the webmaster starts thinking.


"Hmmm...maybe linking out will hurt me! Google might penalize me or, even worse, I’ll get flooded with more and more “please remove my link” spam in future."


So what happens?


The webmaster becomes very wary about linking out. David Naylor mentioned an increasing number of sites adopting a "no linking" policy. Perhaps the webmaster nofollows everything as a precaution. Far from being the life-giving veins of the web, links are seen as potentially malignant. If all outbound links are made nofollow, perhaps the chance of being banned and flooded with “please remove my link” spam is reduced. Then again, even nofollowed links are getting removal requests.
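For reference, the nofollow hint can be applied per link or page-wide; this is standard HTML, with a placeholder URL:

```html
<!-- Per-link: ask search engines not to follow or credit this link -->
<a href="http://example.com/" rel="nofollow">Example</a>

<!-- Page-wide: apply nofollow to every link on the page -->
<meta name="robots" content="nofollow">
```

The blanket meta tag is the "no linking policy" option: one line, and every outbound link on the page stops passing PageRank.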


As more webmasters start to see links as problematic, fewer legitimate sites receive links. Meanwhile, blackhats, who treat the occasional burned site as a cost of doing business, will likely see their sites rise, as theirs will be the sites getting all the links, served up from their curated link networks.


A commenter notes:



The Google webspam team seems to prefer psychology over technology to solve the problem, especially recently. Nearly everything that's come out of Matt Cutts' mouth in the last 18 months or so has been a scare tactic.

IMO all this does is further encourage the development of "churn and burn" websites from blackhats who have being penalized built into their business plan. So why should I risk all the time and effort it takes to generate quality web content when it could all come crashing down because an imperfect and overzealous algorithm thinks it's spam? Or worse, some intern or non-Google employee doing a manual review wrongly decides the site violates webmaster guidelines?



And what’s the point of providing great content when some competitor can just take you out with a dedicated negative SEO campaign, or if Google hits you with a false positive? If most of your traffic comes from Google, then the risk of the web publishing model increases.


Like MetaFilter:



Is Google broken? Or is your site broken? That’s the question any webmaster asks when she sees her Google click-throughs drop dramatically. It’s a question that Matt Haughey, founder of legendary Internet forum MetaFilter, has been asking himself for the last year and a half, as declining ad revenues have forced the long-running site to lay off several of its staff.



Then again, Google may just not want what MetaFilter has to offer anymore.


(Un)Intended Consequences


Could this be anti-competitive practice from Google? Are the sites getting hit with penalties predominantly commercial sites? It would be interesting to see how many of them are non-commercial. If so, is it a way to encourage commercial sites to use AdWords as it becomes harder and harder to get a link by organic means? Even if all it did was raise the cost of doing SEO, it would still be doing its job.


I have no idea, but you could see why people might ask that question.


Let’s say it’s benevolent and Google is simply working towards better results. The unintended consequence is that webmasters will think twice about linking out. And if that happens, their linking behaviour will become more exclusive. When links become harder to get and more problematic, PPC and social media are going to look that much more attractive.





4 Steps to Maximize Local Search Success

Extending a local experience throughout the entire process should be the goal of all local marketing campaigns. Relevant search ads, localized landing pages, owned and earned media, and a blended cost analysis all provide a recipe for success.