This week, members discuss dwell time and page speed, and exactly how much of a role each may play in rankings.
Also, Snapchat unveils a new publisher tool and Google opens up Google Job Search to the dev community.
Member KernelPanic references a 2011 blog post from Bing about how it uses dwell time as a ranking factor.
Ann Smarty brings up that Google is likely using dwell time, but more as one method alongside other attributes. Ann cites the following example,
“For example, when they were working on the authorship projects, they would use dwell time to show more articles by the author when you spent enough time on the author’s article and then clicked back to search results.”
“That was a signal to Google that you wanted to see more by the author… It’s definitely dwell time at work, and since we saw it clearly work, we must assume they use it in other parts of their algorithm too!”
Cre8pc notes a recent presentation from Google’s Gary Illyes stating that HTTPS and page speed act as tie breakers for Google. Members discuss the costs associated with making a site secure, as well as performance improvements, and whether it’s really worth it for a ‘tie breaker’.
Member iamlost mentions that even without the potential ranking improvement, performance improvements can help with time on site and interactions.
In recent months, there have been a great many shifts in rankings across a number of niches. Webmaster World members report various levels of impact in search engine results.
Oh, Snap! (Pun intended.) Snapchat is now releasing a self-publishing ad tool, as it plays catch-up with Facebook, which has been busy stealing its features, and after a lackluster first earnings report. The whole idea of the tool is to make it easier to turn your TV and YouTube ads into Snap Ads. Can Snapchat stay ‘cool’ while catching up with more mature platforms?
Google has opened Job Search to developers, with several key partners including LinkedIn, Monster, DirectEmployers, CareerBuilder, Glassdoor, and Facebook, who “may” already be in the mix. Webmasterworld members take a slightly cynical view as to the level of impact on job search site brands, since users will be staying in Google’s product.
A new member has two different domains, a .de and a .at, targeting two different countries, and yes, hreflang is applied. However, they’re seeing their .at presence showing up in both places. Member Keypler makes the following recommendations on adjustments to get Google to surface the correct site for the appropriate geo-location:
There’s no absolute fix, but you can start experimenting by making changes such as:
• using different meta page titles
• changing meta descriptions
• changing H1 & H2 tags
• changing content (the wording)
• using unique images
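It is also worth double-checking the hreflang annotations themselves, since a non-reciprocal or misconfigured set is a common reason the wrong country version surfaces. A minimal sketch of what the head of each page might contain (the domains and URLs here are hypothetical):

```html
<!-- On https://www.example.de/ -->
<link rel="alternate" hreflang="de-DE" href="https://www.example.de/" />
<link rel="alternate" hreflang="de-AT" href="https://www.example.at/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.de/" />

<!-- On https://www.example.at/, the identical set, so the annotations are reciprocal -->
<link rel="alternate" hreflang="de-DE" href="https://www.example.de/" />
<link rel="alternate" hreflang="de-AT" href="https://www.example.at/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.de/" />
```

Each page must list itself as well as its alternates; if the .at pages do not point back to the .de pages (and vice versa), Google may ignore the annotations entirely.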
Join the discussions to contribute your thoughts and read what forum members share!
The post Dwell Time & Page Speed for Rankings & Google Job Search Opens: Weekly Forum Update appeared first on Internet Marketing Ninjas Blog.
Google launched more than 1,600 changes to their algorithm last year. At that rate, it’s possible that 800 have already been implemented in the first half of 2017.
Of course, that doesn’t mean there are 800 potential impacts to your SEO strategy. Really, only ten of the changes that have been noticed this year have clear implications for the future of SEO. But those ten changes have major implications.
If you haven’t had time to keep up with the endless cycle of SEO news and Google changes, there’s no need to worry. Here’s a quick guide to the important things you might have missed, and what they mean for your SEO strategy in 2017 and beyond.
So far this year, Google has launched a number of new features designed to make search results more user-friendly—and to claim more clicks for themselves:
What this means for the future:
Google is never going to back off trying to keep users on their own pages, and B2B industries are not exempt.
First, make sure your SEO strategy is converting searchers into customers and brand advocates. SEO will always be important, but the best SEO programs are paired with email marketing, social media marketing, and other initiatives that enable reaching audiences directly.
Second, don’t brush off SERP updates that seem to only apply to B2C keywords or industries. Those queries might offer the most immediate or easiest results for Google, but they will expand into B2B industries if and when they can.
Last year, Google announced an impending ranking penalty for sites that display intrusive interstitials before users can access content.
Then, in April, The Wall Street Journal reported that Google may be planning a built-in ad-blocking feature in a future version of its Chrome browser. On the surface this doesn’t make much sense—Google earns its money from ads so it wouldn’t make sense to block them. However, the blocker is not designed to block all ads—just “unacceptable ad types” like prestitial ads, popups, and auto-playing videos.
What this means for the future:
One way or another, Google intends to improve the user experience by eliminating intrusive ads. If you’re still using these formats to gain newsletter subscribers, it’s time to rethink your approach. There are a variety of alternative approaches:
Marketo’s main blog page displays a newsletter sign-up form at the top of the right sidebar.
This form appears below every post on Propecta’s blog.
Banner ads are one of Google’s examples of acceptable, non-intrusive ad types.
HTTPS has been a minor ranking factor for a while now, and the small boost it offered may not have been enough to encourage sites to go through the process of converting. In May, however, Google announced that starting in October, Chrome will display a “Not secure” warning in the omnibox on all HTTP pages that require users to enter data.
What this means for the future:
While it may not happen this year, eventually the warning will appear before all HTTP URLs. This warning may deter users from spending time on your site, even if they’re just browsing a blog post that doesn’t require any submission of data. To play it safe, it’s best to take steps to convert to HTTPS now so you’ll have plenty of time to get through the process before the warnings begin displaying.
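The migration itself usually comes down to installing a certificate and then permanently redirecting all HTTP traffic. A minimal sketch, assuming an Apache server and a hypothetical domain (directives vary by host and server software):

```apache
# Send every HTTP request to its HTTPS equivalent with a permanent (301) redirect
<VirtualHost *:80>
    ServerName www.example.com
    Redirect permanent / https://www.example.com/
</VirtualHost>
```

Canonical tags, XML sitemaps, and internal links should be updated to the https:// versions at the same time, and the HTTPS property added in Search Console.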
AMP results debuted in 2016 and primarily appeared for news-related queries. But since then, AMP results and carousels have been multiplying.
Although AMP results are only particularly important for news publishers at the moment, the quick expansion of AMP results suggests that Google believes AMP results provide more optimal user experiences.
What this means for the future:
Some have predicted that AMP will become a ranking factor. Whether that’s true or not is yet to be seen, but what is obvious is that Google believes that AMP provides a better search experience.
If that’s the case, it’s not a stretch to imagine that Google will expand AMP results to display for other, non-news industries. For that reason, it’s a good time for all digital publishers to assess the potential benefits of implementing AMP.
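Implementing AMP typically means publishing a parallel AMP version of each page and cross-linking the two so Google can discover the pairing. A sketch of the discovery markup, with hypothetical URLs:

```html
<!-- On the regular (canonical) article page -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP version, pointing back at the canonical page -->
<link rel="canonical" href="https://www.example.com/article/">
```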
Amazon’s Echo was the online retailer’s best-selling product during the 2016 holiday season, and it’s anticipated that nearly 25 million additional Amazon Echo and Google Home devices will be sold this year. Greater adoption of personal assistants will inevitably lead to increased voice search queries. In fact, 50% of all queries are expected to be conducted via voice search by 2020. Additionally:
A small “Feedback” link now appears below featured snippets, allowing users to report inaccurate or offensive content.
What this means for the future:
Voice search will continue to grow, and Google will continue to seek ways to provide the most accurate and highest quality answers to voice search queries. That means that SEOs must begin forming voice search optimization strategies. This includes targeting natural language keywords, optimizing for featured snippets, and ensuring sites are mobile friendly.
The most critical task for SEOs and their leaders now—and in the future—is to keep the big picture in mind.
Google’s goal has always been, and will always be, to provide the best user experience possible: it’s what keeps people from turning to other search engines. Paying attention to algorithm changes and SEO updates is important, but if you’re already striving to provide an exceptional user experience, the changes shouldn’t upset your strategies too much.
Work with Google to give users what they want, but remember that Google might also be plotting against you. Use the company’s insights to learn how to delight users, and make sure your SEO is always working to build an audience of your own.
This week has been the week of “what took so long”: Instagram is finally taking action on fake followers and Firefox is finally addressing its speed issues.
In other discussions of note, members provide tips for how webmasters can control for “dodgy” reviews, Google provides a not-so-clear clarification on duplicate content, and more!
The consensus on Webmaster World seems to be, “what took so long?”
There’s the “follow the money” and there’s the “brand recognition”, and then there’s “pr by spam” … all detrimental to most communities.
Glad it’s happening. Wondering why it took so long. Maybe they needed the Fake Numbers to bolster their sale to FB?
One member also noted that if Facebook had been keeping up with the Webmaster World User Agent and Bot ID forum, they may have done it sooner.
With the latest update, Firefox promises faster speeds and reduced memory usage. For those of us wondering what took so long:
According to the scoop in the article referenced here, the holdup was moving from a single-process architecture to a multi-process architecture. Nick Nguyen, Product VP at Firefox, urges former users to give Firefox another chance,
“if you’ve stopped using Firefox, give it a try again.”
In the most recent case of moving to HTTPS, a member asks about changing from absolute to relative link references and whether this may have a potential impact.
There’s no earthly reason ever to use absolute links for your own site, except in the rare case where some areas are http-only while others are https-only.
That’s assuming when you say “relative links” you mean ones beginning in / or //. If you mean links beginning in ../ then it’s time to have a talk with your host.
Changing references from absolute to relative has been a controversial topic in the past, but on this thread members seem to agree that relative links are fine, so long as all the steps involved are done correctly.
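For clarity, the “relative links” being endorsed are root-relative ones, beginning with /, which inherit whatever scheme and host the page was served over. A quick illustration with a hypothetical path:

```html
<!-- Absolute: hard-codes http:// and forces a redirect after the HTTPS move -->
<a href="http://www.example.com/widgets/blue.html">Blue widgets</a>

<!-- Root-relative: resolves against the current scheme and host automatically -->
<a href="/widgets/blue.html">Blue widgets</a>
```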
Cre8asiteforums member Tam asks about dealing with dodgy reviews on her website, and member earlpearl discusses methods used by other providers to control reviews.
In a recent tweet, Google’s Gary Illyes, when asked to define duplicate content, gave an appropriately short and sweet answer,
“Think of it as a piece of content that was slightly changed, or if it was copied 1:1 but the boilerplate is different”.
Unfortunately, this provides very little guidance for webmasters in terms of threshold, especially for enterprise websites where content is mostly dynamically driven.
Members discuss crawl delay and whether it is still used. Member NoOneSpecial suggests the crawl-delay directive is obsolete, since crawling systems are sufficiently advanced not to require it. Clarifying specifically in terms of Google, NoOneSpecial says that they ignore it.
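For reference, the directive under discussion looks like this in robots.txt; Googlebot ignores it, while some other crawlers (Bing, Yandex) have historically honored it:

```
User-agent: *
# Ask compliant bots to wait 10 seconds between requests (not honored by Google)
Crawl-delay: 10
```

For Googlebot specifically, crawl rate is managed through the Search Console crawl-rate settings rather than robots.txt.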
This week has been rich in interesting discussions!
Forum members have been talking about Google’s planned Chrome Adblocker, 2017 State of the Internet Trends report, Google’s drop of NOODP meta tag support and what it means for marketers and marketing conferences.
Here are our highlights:
In a perplexing move, Google announced that they plan to release their own adblocker.
To quote the referenced Wall Street Journal (WSJ) Article,
“The ad-blocking feature, which could be switched on by default within Chrome, would filter out certain online ad types deemed to provide bad experiences for users as they move around the web.”
Some of the ad types that may be affected will be those defined by the Coalition For Better Ads, the article goes on to state. As far as the ad types affected, the WSJ specifies,
“According to those standards, ad formats such as pop-ups, auto-playing video ads with sound and “prestitial” ads with countdown timers are deemed to be “beneath a threshold of consumer acceptability.”
Webmasterworld members discuss the potential modus operandi of such an initiative. Member Mack suggests,
“Google may well assume they can take over the ad-block industry. Become the de facto entity then let it slide away”.
Member Lucy goes on to mention,
“But really, isn’t an ad blocker for Chrome just another of those inevitable bandwagon-following moves, like when MSIE incorporated multi-tabbed browsing because everyone else in the universe had been doing it for years?”
Member Londrum states,
“could be a good idea. every user who uses this is one less who uses another ad-blocker, which are 100 times worse for publishers because they usually just block absolutely everything, whether it’s unobtrusive or not “
Tangor quotes a comment from an article on The Register about the degree to which such an action makes Google a gatekeeper and “arbiter of taste”. The thread closes with a comment from engine that this initiative is said to come into effect in early 2018.
Over on Cre8asiteforums.com, a similar discussion around Google’s expected new ad blocker emerged. Some users were definitely in favor, citing that this will help control some of the more overreaching uses of ads; other members were more skeptical. Some interesting comments included one from iamlost, who stated
“This is just an extension of the AMP approach to let in Google ads and control other networks. Take it with a shaker full of salt.”
and EGOL went on to say that,
“I think that Google is building ad blocking into their browser to squelch the enormous number of crappy ads that appear on some websites. Some sites are gentle with their ads and some sites are a little less gentle, but a lot of sites are blasting ads in your face the whole way down the page.”
Grumpus also commented that,
“Googlebot has no person behind it. A web browser does. Why hire a bunch of human reviewers when you can get the entire world to do it for you for free?”
Engine posts the State of Internet Trends report, with commentary on how webmasters may be affected; some highlights are discussed in the thread.
cre8pc shares recent survey results for what are the most important factors for marketers, when considering going to a conference.
For cre8pc, the split between genders was most interesting, noting that the most important factors for women are cost, speakers, and location and for men, it’s cost, networking and speakers, with men choosing networking over speakers.
Members, in general, say that they do get value out of going to conferences but sometimes cost and location are restrictions that keep them from going more often.
A new user removed the dates from WordPress blog post URLs and experienced a dip in organic search traffic, and is wondering whether they should still roll this initiative out to other sites as well.
Members speak to the level of risk in changing URL structures and provide tips for handling structure changes.
In their latest Webmaster Central blog post, Google notified users that they will no longer be supporting the NOODP tag.
For those of you thinking, “What is the NOODP tag?”: it is a tag that tells Google not to use the Open Directory Project (DMOZ) description as the snippet for your site. Now that support has ended, the tag can be removed.
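For reference, the now-unsupported tag looked like this and can safely be removed from page templates:

```html
<!-- Asked engines not to use the Open Directory Project (DMOZ) description as the snippet -->
<meta name="robots" content="noodp">
```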
Join the discussions to contribute your thoughts and read more!
Does click-through-rate impact Google rankings? Is Google preparing to hit guest posting links again? Will we all be forced to migrate to HTTPS? This week’s forum discussions bring up important points. Join in!
According to researchers at Check Point, the “Judy” malware bypassed Google Bouncer, the Play Store’s protection system. The malware generates fraudulent clicks on ads. The 50 apps in the Google Play Store carrying the malware have been downloaded 36.5 million times.
Google released a blog post warning webmasters about how they link from guest posts and sponsored content. In addition to cautioning about links, they also cautioned about low quality content written without sufficient subject matter expertise.
Google does state:
“Google does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. “
But they do say the presence of guest posting or sponsored post links may affect the whole site, not just the page the low-quality links were detected on:
“When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking. “
We wrote last week about the Google Adhub (beta) announcement, which seeks to connect online clicks to offline purchases using credit card numbers. This discussion continues on a new thread in Webmaster World this week about how exactly Adhub may work.
For consumers, there’s a fear of a great deal more tracking going on, and privacy advocates have already spoken out. However, Google has gone on to allay fears by stating that no user-level data is accessible to advertisers, and only impression data is accessible. The credit card is used to track spending, but there’s no clear indication as to how that works, and who’s sharing what data. That one may yet run and run from a privacy point of view.
Last week, we reported on a discussion on SEOChat about if CTR is a ranking factor. When we checked again this week, a lively and heated discussion was still underway.
Here is a nice SMX presentation by Gary Illyes at 2016 SMX West
I know Moz tested this and the results were temporary, but to me this suggests that CTR is a ranking factor. If CTR spikes (because Rand tells everyone to click a link), rankings might rise. But when that spike goes away a week later (when Rand is no longer doing the experiment), Google would pick up on that and drop the site back down in the rankings. I would hypothesize that a sustained boost in CTR (say, from writing a better title) would yield a sustained boost in rankings.
We have discussed the topic over at the Jim and Ann Show. Check it out: Bounce Rate, Long Click, Pogo-Sticking, Dwell Time, Click-Through
A cre8asiteforums member noted that some major ecommerce sites have not yet moved to SSL. Members hypothesize as to why this may be the case. EGOL comments:
In my mind, an info site doesn’t need to be https and the display pages on retail don’t have to be https… but Google is getting out a carrot and stick to force us into it.
Join the discussions above to contribute your thoughts
This week has seen some very interesting discussions around leveraging analytics for SEO, transparency in digital advertising, and of course some interesting technical SEO questions.
Member iamlost states a case for how analytics can be an agent of change for SEOs who are making do without keyword-level metrics, proposing using the data to build detailed personas that can then be used for personalization.
iamlost then discusses an architecture that can be used to make this happen. Members go on to discuss the merits of using server logs vs. analytics solutions, and of cloud-based analytics solutions, such as Google Analytics, vs. on-premise solutions, such as Piwik.
Tangor shows a snippet of a recent Wall Street Journal article about a recent wave of refunds to advertisers due to a metrics bug. Tangor goes on to state that this is the fifth such bug reported by Facebook since September.
Members discuss their own experiences of how ad clicks tend to fail to “make it across the register” to Google Analytics, and discuss the overall transparency and effectiveness of Facebook’s ads.
Over on Threadwatch, Google stated that its Adhub is able to connect online interactions to offline purchases by tracking credit card transactions.
Although the announcement has drawn criticism from privacy advocates, such a measure is sure to further bolster ad sales due to the increased visibility into how transactions are originated.
Members discuss how Twitter personalization may work and how Twitter is grouping users, with sometimes amusing results.
Twitter is now ‘adding’ perceived gender (from behaviour) to accounts that have declined to specify (last sentence in quote below).
Member optisite is looking to sunset a website that has lots of backlinks but is concerned about potential impacts. Members discuss best steps to take.
Member timemachined discusses a case where a page with an outbound link to another page is ranking for the target queries of that other page.
Members discuss potential next steps to fix this issue if internal linking is causing keyword cannibalization of content.
I think the keyword linking would perform as you intend if there was enough additional content and specific keywords/phrases supporting it.
1. Anchor text within a web page is said to be a signal of what that web page is about.
2. It’s possible your page is not structured correctly to communicate what it is about. Run your code through the W3C Validator and tick the box for the OUTLINE to visualize your page hierarchy.
Join the threads to follow the discussions and contribute your own thoughts!
On the strategic front this week, members discuss to what degree you can trust proprietary link metrics in SEO tools.
Also, members talk about how the SEO game has changed from the 1990s to present.
In the tactical realm, members react to changes in AdSense and the Facebook algorithm, and consider whether a site should roll back HTTPS if Google drops its traffic after the migration.
Members provide insights on what exactly the Domain Authority metric means.
For context, Domain Authority refers to a proprietary Moz metric that tries to gauge the “authority” of a domain. This metric is typically used for backlink analysis. Member prof.stan states that the DA metric very often does not match Google’s algorithms.
Members discuss the overall value of such a metric, with the consensus being that such metrics are created by SEO providers and are not reflective of how Google may valuate links. We also covered this in one of our episodes of the Jim & Ann Show, which you can check out here.
On the topic, Jim says that you can’t trust Domain Authority (DA) metrics or “toxic link” metrics because, among other reasons, none of the available tools have as deep an index as Google.
In this thread, Kim Krause Berg muses on what makes someone an official SEO, referencing Gary Illyes’ naming of whom he thought were top SEOs.
Kim muses whether, much like software developers who now work from libraries, SEOs are now just a form of Google marketer.
Member earlpearl shares an account of how he is observing increasingly aggressive monetization of local listings. Member EGOL shares that the same seems to be true for universal search, but seems to imply that we cannot fault Google for seeking to maximize profits for shareholders.
Earlpearl continues on to share accounts of several SMBs he works with and his experience trying to drive leads on the organic and paid search fronts.
According to member Engine, Facebook has been taking action to control low-quality ads in the newsfeed, stating that some actions taken include
“using artificial intelligence to identify pages and sites that have similar characteristics”.
How will this affect advertisers overall?
Engine shares Google’s new “Ad Balance” feature for collapsing unfilled ad units.
Publishers share their thoughts on the feature and their experiences from trying it out.
Given the news across the forum-scape that moves to HTTPS have resulted in losses of organic search traffic, Webmaster World members weigh whether it’s worth going back or not.
This week has been rich in interesting updates from search and social platforms.
Our forum users have been actively discussing Google’s raters getting a cut in hours, Twitter teaming up with Bloomberg, Googlebot ignoring robots.txt directives, and more!
Please click the links below for more details:
In this thread, an article is referenced about Google’s quality raters, who work through at least two different subcontractor entities, seeing their 40-hour jobs cut in half.
A close read of the article provides an interesting overview of the types of tasks that quality raters do: their tasks span products and include more than you think!
On the Webmaster World link mentioned in the thread, members speculate about the potential causes and macro trends.
In this thread, members discuss the Larry Kim post stating that the value of a Google ranking is down 37% in two years. Members seem to agree in general that, despite increased rankings, year-over-year traffic still seems to be down in many cases.
The discussion highlight was NewDelhiSEO’s comment that:
“It will ultimately become a game of money till users start hating Google for too much monetization. Until then, SEO is losing value every day.”
As the publisher race to social platforms develops, Webmaster World members discuss this latest development. Members also consider other content deals that Twitter is working on.
Interesting development! Wonder what will happen to Periscope, Meerkat (still alive?) and Facebook live video?
Will direct partnerships with social platforms be something that all large publishers will need to consider for the future?
Member Chedders grapples with the best way to show a gridded experience on mobile.
Options discussed include
A member asks about keeping affiliate pages out of the index despite an effort to both nofollow links and block the pages via robots.txt. One member points out that any and all links to those pages would need to be nofollowed, including navigation links. Other members remark on Google’s “voracious appetite” to crawl a URL even if they found a link to it just once.
Other members state that Google understands patterns in affiliate links and there should be no major concern around Google finding them. Still other members recommend removing the links from any sitemaps.
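The combination being discussed can be sketched with a hypothetical /go/ directory that holds the outbound affiliate redirects:

```
# robots.txt: keep compliant crawlers out of the redirect directory
User-agent: *
Disallow: /go/
```

Every internal link into /go/ would then also carry rel="nofollow" (e.g. <a href="/go/some-offer" rel="nofollow">), since robots.txt blocks crawling but does not by itself stop a linked URL from being indexed.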
A day after the alleged major update, I thought it would make sense to highlight where we are at in the cycle.
Yesterday Google suggested their fear messaging caused 4.7% of webmasters to move over to mobile friendly design since the update was originally announced a few months ago.
The 4.7% of the websites Google pushed to go mobile friendly likely include some sites which would have been mobile friendly anyhow by virtue of being new sites on hosted platforms with responsive designs. But for the rest of the sites, was the shift worth it?
That is a tough question.
It is too early to tell.
The problem with going early is you eat the expense upfront, while the rewards are still unknown.
If you are spending your own time & money and you believe in what you are doing and the longevity of a project then it doesn’t matter too much if the rewards come slowly or never come. A sense of purpose & a sense of pride in your work is a form of payment.
However, if you are spending a client’s money & you ring a five-alarm fire to rush to make some technical change & then see no upside after the much-hyped announcement, that erodes client trust. If there is no upside and a huge drop in revenue, then the consultant looks like a clueless idiot burning money for the sake of it on various make-work projects.
A few years ago a Google rep stated Panda would be folded into the regular algorithms. Then recently we were told it was near realtime. Then we were told it was something where updates needed to be manually pushed out & it is something Google hasn’t done in 4 months. If we trusted Google & conveyed any of these messages to clients, once again we looked like idiots. If we choose to invest client money based on the cycles and advice we are given, quite often that is a money incinerator.
Imagine dropping $30,000 on a link cleanup project where you remove links which were helping your Bing rankings but the Google update “coming soon” takes over a year to show up.
Invest money to lower your current income while you’re waiting for Godot.
So after Google made a big show of this pending mobile update by pre-announcing it, speaking about it at multiple conferences, comparing it to Panda and Penguin & stating it would have a bigger impact, sending out millions of warning messages via Webmaster Tools, etc etc etc .. when the big day came, did Google make the people who trusted them & invested in their advice look good?
Not so much.
Ayima recently launched a SERP flux pulse tracker tool which shows desktop and mobile flux side-by-side.
As you can see, nothing happened.
So far, no rewards. Maybe they will come. Though here is a hypothetical example where it could be very much NOT worth it for some publishers to go mobile friendly…
Any form of penalty (even a false positive) can become self-reinforcing. And many of the things which seem like they might help could cause harm.
Did you jump the gun or wait and see?