This week, members discuss dwell time and page speed, and exactly how much of a role they may play in rankings.
Also, Snapchat unveils a new publisher tool and Google opens up Google Job Search to the dev community.
Member KernelPanic references a 2011 blog post from Bing about how it uses dwell time as a ranking factor.
Ann Smarty brings up that Google is likely using dwell time, but more as one method alongside other attributes. Ann cites the following example,
“For example, when they were working on the authorship projects, they would use dwell time to show more articles by the author when you spent enough time on the author’s article and them clicked back to search results.”
“That was a signal to Google that you wanted to see more by the author… It’s definitely dwell time in working and since we saw it clearly work, we must assume they use it in other parts of their algorithm too!”
Cre8pc notes a recent presentation from Google’s Gary Illyes stating that HTTPS and page speed act as a tiebreaker for Google. Members discuss the costs associated with making a site secure, as well as performance improvements, and whether it’s really worth it for a ‘tiebreaker’.
Member iamlost mentions that, even setting aside the potential ranking improvement, performance improvements can help with time on site and interactions.
In recent months, there have been significant shifts in rankings across a number of niches. WebmasterWorld members report various levels of impact in search engine results.
Oh, Snap! (Pun intended.) Snapchat is releasing a self-serve ad publishing tool as it plays catch-up with Facebook, which has been busy copying its features, and following a lackluster first earnings report. The whole idea of the tool is to make it easier to turn your TV and YouTube ads into Snap ads. Can Snapchat stay ‘cool’ while catching up with more mature platforms?
Google has opened Job Search to developers, with several key partners including LinkedIn, Monster, DirectEmployers, CareerBuilder, Glassdoor, and Facebook, who “may” already be in the mix. WebmasterWorld members take a slightly cynical view of the level of impact on job search sites’ brands, since users will stay inside Google’s product.
A new member has two different domains, a .de and a .at, targeting two different countries, and yes, hreflang is applied. However, they’re seeing their .at presence showing up in both places. Member Keypler makes the following recommendations on adjustments to get Google to surface the correct site for the appropriate geo-location (a minimal hreflang sketch follows the list),
There’s no absolute fix, but you can start experimenting by making the following changes:
• use different meta page titles
• change meta descriptions
• change H1 & H2 tags
• change content (the wording)
• use unique images
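For orientation, hreflang annotations for a German/Austrian domain pair usually look something like the minimal sketch below (the example.de / example.at URLs are hypothetical); both domains need the full, reciprocal set of tags on every page pair.

    <!-- On https://www.example.de/seite/ (hypothetical URL) -->
    <link rel="alternate" hreflang="de-DE" href="https://www.example.de/seite/" />
    <link rel="alternate" hreflang="de-AT" href="https://www.example.at/seite/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.de/seite/" />
    <!-- The .at page carries the same three tags, so each version points to
         itself and to its counterpart on the other domain. -->

If the tags are already correct, the on-page differentiation Keypler recommends above is what gives Google an additional reason to show the right ccTLD in the right country.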
Join the discussions to contribute your thoughts and read what forum members share!
This week has been the week of “what took so long”: Instagram is finally taking action on fake followers and Firefox is finally addressing its speed issues.
In other discussions of note, members provide tips for how webmasters can control for “dodgy” reviews, Google provides a not-so-clear clarification on duplicate content, and more!
The consensus on WebmasterWorld seems to be, “What took so long?”
There’s the “follow the money” and there’s the “brand recognition”, and then there’s “pr by spam” … all detrimental to most communities.
Glad it’s happening. Wondering why it took so long. Maybe they needed the Fake Numbers to bolster their sale to FB?
One member also noted that if Facebook had been keeping up with the WebmasterWorld User Agent and Bot ID forum, they might have done it sooner.
With the latest update, Firefox promises faster speeds and reduction in memory usage. For those of us wondering, what took so long?
According to the scoop in this article, it seems that the issue was moving from a single-process architecture to a multi-process architecture. Nick Nguyen, VP of Product for Firefox, urges former users to give Firefox another chance,
“if you’ve stopped using Firefox, give it a try again.”
In a recent case of moving to HTTPS, a member asks about changing from absolute to relative link references and whether this might have an impact.
There’s no earthly reason ever to use absolute links for your own site, except in the rare case where some areas are http-only while others are https-only.
That’s assuming when you say “relative links” you mean ones beginning in / or //. If you mean links beginning in ../ then it’s time to have a talk with your host.
Changing references from absolute to relative has been a controversial topic in the past, but on this thread members seem to agree that relative links are fine, so long as all the steps involved are done correctly.
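To make the distinction concrete, here is a small sketch of the link styles under discussion (the paths are hypothetical):

    <!-- Absolute: hard-codes the scheme and host -->
    <a href="https://www.example.com/shop/widgets.html">Widgets</a>
    <!-- Root-relative: begins with "/", inherits scheme and host from the current page -->
    <a href="/shop/widgets.html">Widgets</a>
    <!-- Protocol-relative: begins with "//", inherits only the scheme -->
    <img src="//cdn.example.com/img/logo.png" alt="Logo">

The root-relative and protocol-relative forms are the ones the quoted comment describes as beginning in / or //.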
Cre8asiteforums member Tam asks about dealing with dodgy reviews on her website, and member earlpearl discusses methods used by other providers to control reviews.
In a recent tweet, Google’s Gary Illyes, when asked to define duplicate content, gave an appropriately short and sweet answer,
“Think of it as a piece of content that was slightly changed, or if it was copied 1:1 but the boilerplate is different”.
Unfortunately, this provides very little guidance for webmasters in terms of threshold, especially for enterprise websites where content is mostly dynamically driven.
Members discuss crawl delay and whether it is still used. Member NoOneSpecial clarifies that the crawl-delay directive is essentially obsolete, since crawling systems are sufficiently advanced not to require it. Clarifying specifically in terms of Google, NoOneSpecial says that Google ignores it.
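For context, the directive in question is a plain line in robots.txt, along the lines of this hypothetical snippet; Google has documented that Googlebot ignores Crawl-delay (its crawl rate is managed through Search Console instead), while some other crawlers have historically honored it.

    # Hypothetical robots.txt snippet
    User-agent: *
    Crawl-delay: 10   # ask compliant crawlers to wait roughly 10 seconds between requests
    # Googlebot ignores Crawl-delay; its crawl rate is set in Search Console instead.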
This week has been rich in interesting discussions!
Forum members have been talking about Google’s planned Chrome Adblocker, 2017 State of the Internet Trends report, Google’s drop of NOODP meta tag support and what it means for marketers and marketing conferences.
Here are our highlights:
In a perplexing move, Google announced that they plan to release their own adblocker.
To quote the referenced Wall Street Journal (WSJ) Article,
“The ad-blocking feature, which could be switched on by default within Chrome, would filter out certain online ad types deemed to provide bad experiences for users as they move around the web.”
The ad types that may be affected will be those defined by the Coalition for Better Ads, the article goes on to state. As for which formats are in scope, the WSJ specifies,
“According to those standards, ad formats such as pop-ups, auto-playing video ads with sound and ‘prestitial’ ads with countdown timers are deemed to be ‘beneath a threshold of consumer acceptability.’”
Webmasterworld members discuss the potential modus operandi of such an initiative. Member Mack suggests,
“Google may well assume they can take over the ad-block industry. Become the de facto entity then let it slide away”.
Member Lucy goes on to mention,
“But really, isn’t an ad blocker for Chrome just another of those inevitable bandwagon-following moves, like when MSIE incorporated multi-tabbed browsing because everyone else in the universe had been doing it for years?”
Member Londrum states,
“could be a good idea. every user who uses this is one less who uses another ad-blocker, which are 100 times worse for publishers because they usually just block absolutely everything, whether it’s unobtrusive or not “
Tangor quotes a comment from an article on The Register about the degree to which such an action makes Google a gatekeeper and “arbiter of taste”. The thread closes with a comment from engine that this initiative is said to come into effect in early 2018.
Over on Cre8asiteforums.com, a similar discussion around Google’s expected new ad blocker emerged. Some users were definitely in favor, citing that this will help control some of the more overreaching uses of ads, while other members were more skeptical. Some interesting comments included one from iamlost, who stated
“This is just an extension of the AMP approach to let in Google ads and control other networks. Take it with a shaker full of salt.”
and EGOL went on to say that,
“I think that Google is building ad blocking into their browser to squelch the enormous number of crappy ads that appear on some websites. Some sites are gentle with their ads and some sites are a little less gentle, but a lot of sites are blasting ads in your face the whole way down the page.”
Grumpus also commented that,
“Googlebot has no person behind it. A web browser does. Why hire a bunch of human reviewers when you can get the entire world to do it for you for free?”
Engine posts the State of Internet Trends report, with commentary on how webmasters may be affected; head over to the thread for the highlights.
Cre8pc shares recent survey results on the factors that matter most to marketers when considering attending a conference.
For Cre8pc, the split between genders was most interesting: the top factors for women are cost, speakers, and location, while for men they are cost, networking, and speakers, with men choosing networking over speakers.
Members, in general, say that they do get value out of going to conferences but sometimes cost and location are restrictions that keep them from going more often.
A new user removed the dates from WordPress blog post URLs and experienced a dip in organic search traffic. They are wondering whether they should roll this change out to other sites as well.
Members speak to the level of risk in changing URL structures and provide tips for handling such a change; a hedged redirect sketch follows below.
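The standard precaution when removing dates from WordPress permalinks is a blanket 301 redirect from the old URLs to the new ones. A minimal sketch, assuming Apache with mod_rewrite and an old /YYYY/MM/DD/post-name/ structure (adjust the pattern to your actual permalink settings):

    # .htaccess: redirect /2017/05/31/post-name/ to /post-name/ (hypothetical pattern)
    RewriteEngine On
    RewriteRule ^[0-9]{4}/[0-9]{2}/[0-9]{2}/(.+)$ /$1 [R=301,L]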
In their latest Webmaster Central Blog post, Google notified users that they will no longer be supporting the NOODP tag.
For those of you who are thinking, “What is the NOODP tag?”: it is the tag that tells Google not to use the Open Directory Project as a source for your site’s snippets, and with support ending, it can be removed.
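For reference, NOODP is a robots meta directive; if it is still sitting in your templates it looks like this and can now safely be deleted:

    <meta name="robots" content="noodp">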
Join the discussions to contribute your thoughts and read more!
Does click-through-rate impact Google rankings? Is Google preparing to hit guest posting links again? Will we all be forced to migrate to HTTPS? This week’s forum discussions bring up important points. Join in!
According to researchers at Check Point, the “Judy” malware bypassed Google Bouncer, the Play Store’s protection system. The malware generates fraudulent ad clicks. The 50 apps in the Google Play Store carrying the malware have been downloaded 36.5 million times.
Google released a blog post warning webmasters about how they link from guest posts and sponsored content. In addition to cautioning about links, they also cautioned about low quality content written without sufficient subject matter expertise.
Google does state:
“Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. “
But they do say the presence of guest posting or sponsored post links may affect the whole site, not just the page the low-quality links were detected on:
“When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking. “
We wrote last week about the Google AdHub (beta) announcement, which seeks to connect online clicks to offline purchases using credit card data. The discussion continues this week in a new thread on WebmasterWorld about how exactly AdHub may work.
For consumers, there’s a fear of a great deal more tracking going on, and privacy advocates have already spoken out. However, Google has gone on to allay fears by stating that no user-level data is accessible to advertisers, and only impression data is accessible. The credit card is used to track spending, but there’s no clear indication as to how that works, and who’s sharing what data. That one may yet run and run from a privacy point of view.
Last week, we reported on a discussion on SEOChat about if CTR is a ranking factor. When we checked again this week, a lively and heated discussion was still underway.
Here is a nice presentation by Gary Illyes from SMX West 2016.
I know Moz tested this and the results were temporary, but to me this suggests that CTR is a ranking factor. If CTR spikes (because Rand tells everyone to click a link), rankings might rise. But when that spike goes away a week later (when Rand is no longer doing the experiment), Google would pick up on that and drop the site back down in the rankings. I would hypothesize that a sustained boost in CTR (say, from writing a better title) would yield a sustained boost in rankings.
We have discussed the topic on the Jim & Ann Show. Check it out: Bounce Rate, Long Click, Pogo-Sticking, Dwell Time, Click-Through.
A Cre8asiteforums member noted that some major ecommerce sites have not yet moved to SSL. Members hypothesize as to why this may be the case. EGOL comments:
In my mind, an info site doesn’t need to be https and the display pages on retail don’t have to be https… but Google is getting out a carrot and stick to force us into it.
Join the discussions above to contribute your thoughts!
This week has seen some very interesting discussions around leveraging analytics for SEO, transparency in digital advertising, and of course some interesting technical SEO questions.
Member iamlost makes a case for how analytics can be an agent of change for SEOs who are making do without keyword-level metrics, proposing that the data be used to build detailed personas which can then drive personalization.
iamlost then discusses an architecture that can be used to make this happen. Members go on to debate the merits of server logs versus analytics solutions, and of cloud-based analytics tools, such as Google Analytics, versus on-premise solutions, such as Piwik.
Tangor shares a snippet of a recent Wall Street Journal article about a wave of refunds to advertisers due to a metrics bug. Tangor goes on to state that this is the fifth such bug reported by Facebook since September.
Members discuss their own experiences with ad clicks failing to “make it across the register” to Google Analytics, and debate the overall transparency and effectiveness of Facebook’s ads.
Over on Threadwatch, members look at Google’s statement that its AdHub is able to connect online interactions to offline purchases by tracking credit card transactions.
Although the announcement has drawn criticism from privacy advocates, such a measure is sure to further bolster ad sales due to the increased visibility into how transactions are originated.
Members discuss how Twitter personalization may work and how Twitter is grouping users, with sometimes amusing results.
Twitter is now ‘adding’ perceived gender (inferred from behaviour) to accounts that have declined to specify one.
Member optisite is looking to sunset a website that has lots of backlinks but is concerned about potential impacts. Members discuss best steps to take.
Member timemachined discusses a case where a page with an outbound link to another page is ranking for the target queries of that other page.
Members discuss potential next steps to fix the issue if internal linking is causing keyword cannibalization of content.
I think the keyword linking would perform as you intend if there was enough additional content and specific keywords/phrases supporting it.
1. Anchor text within a web page is said to be a signal of what that web page is about.
2. It’s possible your page is not structured correctly to communicate what it is about. Run your code through the W3C Validator and tick the box for the OUTLINE to visualize your page hierarchy.
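As a rough illustration of that second point, a heading hierarchy that communicates what the page is about (and that the W3C outline view would render sensibly) looks something like this hypothetical sketch:

    <!-- Hypothetical page targeting "blue widgets" -->
    <h1>Blue Widgets</h1>
    <h2>Blue Widget Sizes</h2>
    <h2>Blue Widget Prices</h2>
    <h3>Bulk Discounts</h3>
    <!-- A single descriptive h1 with h2/h3 subsections in order lets the page
         itself say what it is about, rather than relying on anchor text alone. -->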
Join the threads to follow the discussions and contribute your own thoughts!
On the strategic front this week, members discuss to what degree you can trust proprietary link metrics in SEO tools.
Also, members talk about how the SEO game has changed from the 1990s to present.
In the tactical realm, members react to changes in AdSense and the Facebook algorithm, and consider whether a site should roll back HTTPS if Google drops its traffic after the migration.
Members provide insights on what exactly Domain Authority metric means.
For context, Domain Authority is a proprietary Moz metric that tries to gauge the “authority” of a domain, typically used for backlink analysis. Member prof.stan states that the DA metric very often does not match Google’s algorithms.
Members discuss the overall value of such a metric, with the consensus being that these metrics are created by SEO providers and are not reflective of how Google may evaluate links. We also covered this topic in one of our episodes of the Jim & Ann Show, which you can check out here.
On that topic, Jim says you can’t trust Domain Authority (DA) metrics or “toxic link” metrics because, among other reasons, none of the available tools has anywhere near as deep an index as Google.
In this thread, Kim Krause Berg muses on what makes someone an official SEO, referencing Gary Illyes’ naming of who he considered the top SEOs.
Kim wonders whether, much like software developers who now work largely from libraries, SEOs have become just another form of Google marketer.
Member earlpearl shares an account of increasingly aggressive monetization of local listings. Member EGOL shares that the same seems to be true for universal search, but seems to imply that we cannot fault Google for seeking to maximize profits for shareholders.
Earlpearl goes on to share accounts of several SMBs he works with and his experience trying to drive leads on the organic and paid search fronts.
According to member Engine, Facebook has been taking action to control low-quality ads in the newsfeed, stating that the actions taken include
“using artificial intelligence to identify pages and sites that have similar characteristics”.
How will this affect advertisers overall?
Engine shares Google’s new “Ad Balance” feature for collapsing unfilled ad units.
Publishers share their thoughts on the feature and their experiences from trying it out.
Given the news across the forum-scape that some moves to HTTPS have resulted in a loss of organic search traffic, WebmasterWorld members weigh whether it’s worth going back or not.
This week has been rich in interesting updates from search and social platforms.
Our forum users have been actively discussing Google’s raters getting a cut in hours, Twitter teaming up with Bloomberg, Googlebot ignoring robots.txt directives, and more!
Please click the links below for more details:
In this thread, an article is referenced about Google’s quality raters, who work through at least two different subcontractor entities and are seeing their 40-hour jobs cut in half.
A close read of the article provides an interesting overview of the types of tasks that quality raters do: the scope of their work spans products and includes more than you might think!
On the WebmasterWorld link mentioned in the thread, members speculate about the potential cause and broader macro trends.
In this thread, members discuss the Larry Kim post stating that the “value of a Google ranking [is] down 37% in two years.” Members generally seem to agree that, despite improved rankings, year-over-year traffic still appears to be down in many cases.
The discussion highlight was NewDelhiSEO’s comment that:
“It will ultimately become a game of money till users starting hating Google for too much monetization. Until then SEO is loosing value every day.”
As the publisher race to social platforms develops, WebmasterWorld members discuss this latest development and consider other content deals that Twitter is working on.
Interesting development! Wonder what will happen to Periscope, Meerkat (still alive?) and Facebook live video?
Will direct partnerships with social platforms be something that all large publishers will need to consider in the future?
Member Chedders grapples with the best way to show a gridded experience on mobile.
Head over to the thread to see the options discussed.
A member reports that, despite an effort to both nofollow affiliate links and block them via robots.txt, Google is still finding them. One member points out that any and all links to those pages would need to be nofollowed, including navigation links. Other members remark on Google’s “voracious appetite” to crawl a URL even if it found a link just once.
Other members state that Google understands patterns in affiliate links and there should be no major concern around Google finding them. Still other members recommend removing the links from any sitemaps.
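Taken together, the advice in the thread amounts to something like the following sketch (the /go/ affiliate path is hypothetical): nofollow every internal link pointing at the affiliate URLs, disallow the path in robots.txt, and keep those URLs out of the XML sitemap.

    <!-- Every link to the affiliate path, navigation included -->
    <a href="/go/partner-offer" rel="nofollow">Partner offer</a>

    # robots.txt
    User-agent: *
    Disallow: /go/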
Ahoy, and welcome to another weekly update from the Developer Shed Network! Our communities have been positively singing with activity lately – and we’re prepared to relay the song to you here, in blog post form.
On WebmasterWorld, forum members are discussing the ad-blocking arms race. Some researchers at Princeton and Stanford think they’ve developed a magic bullet ad-blocker, but will it stand the test of time?
In a different thread, members are talking about Google’s new AI chip – it’s so powerful that it takes the place of a dozen data centers Google would have had to build otherwise!
Then we’ll shift focus to Cre8asiteforums where members are pondering the difference between data mining and text mining, and SEO Chat where our senior members are talking about how to optimize your traffic.
It’s bound to be an update full of expert information – let’s get started!
In ancient times, man developed the club. Then, later, another man invented the helmet. Still later, another man made the gun – and after him, someone invented the bullet proof vest. The race between arms and armor can be compared to the race between ads and ad-blocking. Researchers at Princeton and Stanford believe they’ve created an ad-blocker that will conquer all online advertising that currently exists…and all that is to come.
“The ad blocker they’ve created is lightweight, evaded anti ad-blocking scripts on 50 out of the 50 websites it was tested on, and can block Facebook ads that were previously unblockable,”
according to an article from Motherboard.
WebmasterWorld member robzilla doesn’t think this is the greatest thing since sliced bread:
“It ain’t over till the fat lady sings. This isn’t ad blocking, it’s ad hiding. What’s the point of that? You waste a ton of bandwidth and it won’t make pages load any faster.”
Other members speculate that ad-blocking is actually a fading practice – many web surfers will whitelist a site if they’re asked nicely enough.
Princeton and Stanford aren’t the only ones with ad-blockers on the brain! Google will be adding ad-blocking features to the mobile and desktop versions of Chrome in the near future. A little weird from a company that runs AdSense and AdWords, no? But Google is a beast with many heads – it does have users and their positive experiences to tend to.
Peter_S wonders if, since ad-blocking will be set to “on” by default, this means that AdSense ads will escape the block. Keyplyr writes,
“If Google blocks all other ads except AdSense, it will find itself in another anti-trust suite.”
The standards appear to be defined by the Coalition for Better Ads – and that doesn’t leave AdSense off the chopping block.
Surely you’ve heard of data mining – when researchers sift through mounds of data, in this case about website visitors, to discover patterns and trends. Text mining is a branch of data mining that focuses purely on the text created by web users.
“Social scientists use text mining tools to learn about shifting public opinion; marketers use it to learn about consumers’ opinions of products and services; and it has even been used to predict the direction of stock markets,”
according to an article from libraryconnect.
EGOL on Cre8asiteforums adds that, as useful as some of that sounds, a lot of text mining is just spam.
“Try to search for a physician in a small community. Instead of finding a physician, you will find professional spam. A lot of affiliate sites that promote retail products are produced by methods that are really professional copyright infringement but the text is professionally obfuscated to make them look otherwise.”
There are plenty of sayings in life about the weather in April – but how about the “SEO weather?” The members of WebmasterWorld are seeing wild fluctuations in their website rankings and traffic lately, and it has plenty of webmasters on edge. Is a new Google update on the horizon? Has it already arrived?
“We saw half of our tracked keywords plummet between 80 – 100 places on Tuesday 18th, some were previously on page 1 in solid positions now on page 10+,”
writes pavsid. Reseller writes that
“RankRisk showing high level of Google Desktop SERP Fluctuations…Of course it could be the start of algorithm update, who knows :)”
What’s going on? Head over to WebmasterWorld to hear the latest gossip and find out!
This is a topic of great personal interest to me! Every webmaster out there seems to be obsessed with getting the MOST traffic they can – why aren’t they focused on getting the BEST traffic instead? Two users who convert are better than fifty who don’t, isn’t that true? Pierre Benneton on SEO Chat writes that
“I noticed that most of my leads and clients only see SEO work as the generation of the maximum possible traffic to a website but they do not understand the importance of the traffic quality. At several occasions I’ve totally repositioned websites on Google…by targeting different keywords with a lower number of monthly searches…”
Prof.stan also has a great post in this thread that describes strategies to communicate to your client about the value of specific traffic avenues. Sometimes it doesn’t matter how well you do your job – you have to be able to prove it in writing too!
This is no potato chip! Google discovered that voice search was about to put a huge load on their data centers. They’d need a dozen or more new centers to process all the voice requests. Instead of going big, they went for efficiency instead – and developed an AI chip that can process these voice requests through “deep neural networks.”
The chip is called the Tensor Processing Unit.
“Next they should build a chip in our minds, that would save a lot of data centers,” mosxu quips. Ergophobe writes, “I see far-reaching implications here, way beyond what they cover in that article. What does this article tell you about the future of SEO?”
Give this a read and share your own answer!
The mobile-first web, the mobile-first web: that’s all we’ve been talking about for the last two or three months, it feels like. And yet, the mobile-first web is still months away from being here! What’s the hold up? Well, if you’ve been following the news, Google wants the mobile-first index to have a “neutral” impact on its SERP quality.
Part of the problem in achieving that is the difference between desktop and mobile link graphs. Mobile sites and users simply don’t use as many links as desktop sites do. That begs the question – once Google figures it out, how will the link graph be impacted?
“John Mueller responded to a tweet that he doubts it will have a major impact,”
writes Threadwatch contributor Mr-X. But how can it not with such a massive disparity between links on mobile and desktop sites? The mechanism that Google chooses to close the gap will be very interesting to study, that much is for sure!