2.03 - 8.03.2020
Many of the latest Google updates have been trying to deal with the problem of correctly assessing the expertise, authoritativeness and trustworthiness (E-A-T) of information sources across the internet. The internet is free for all content creators, but this freedom means anyone can publish almost anything. Search engines felt obligated to downgrade results that contradicted scientific research or other proven facts. Conspiracy theory websites, alternative medicine sites and other similar ones took a serious hit after this new approach to search results.
The issue has been that obviously wrong and misleading information was sometimes displayed in the featured snippets in the SERP. For a comical yet worrisome example, see this article collecting One True Answers that are obviously false. And with Google’s current tendency to deliver an answer before any click on the SERP is made, the issue had to be dealt with. More and more search queries lead to zero clicks, especially searches formulated as questions. Google has greatly increased the number of featured snippets it displays, pushing the “normal” search results down, sometimes below the fold.
In this context, Lily Ray, from Path Interactive, published the results of an interesting survey on how much users trust their search results. Based on 1,100 respondents from the US, India and Europe, it gives a breakdown of user trust in some of the most sensitive areas: medical, financial, legal and news. Judging by some of the graphs displayed, we would say that Google has managed to (re)gain much of the searchers’ confidence.
You can read the entire post on Moz.
In an interesting coincidence, only one day after the study presented above, Perficient Digital published an updated version of its research into user behavior in search. It also studies the effects of changes to featured snippets in the Google SERP, but does so with different instruments.
The study questions whether what we believe we know about user behavior in Google Search is still up to date. One important finding is that about 1 in 3 searches on desktop leads to no click. On mobile, the numbers are staggering: 54.58% of searches lead to no click. This is one of the implications of the evolution of featured snippets.
Also, according to the study, the results vary greatly, with far fewer no-click situations when the SERP displays ads.
Another fundamental finding is that the CTR distribution among the top 10 results on Google depends on whether the search is branded or non-branded. For non-branded search queries, lower positions in the top 10 have a significantly higher chance of receiving clicks.
Read the full research findings here.
Since last week, you can ask Google Assistant to read web pages aloud. There is no question that this update will be very useful both for visually impaired users and for those who want to “read” the news while driving or doing some other activity that requires their visual attention. Webmasters don’t need to do anything for their sites to be ready for this change. They can block it using a “nopagereadaloud” tag, although there won’t be many reasons to do so. The functionality is also available for apps, but only if developers add it.
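For webmasters who do want to opt out, the block is a meta tag in the page head. A minimal sketch, assuming the tag follows Google’s usual meta-tag convention (only the “nopagereadaloud” value comes from the announcement; verify the exact syntax against Google’s documentation):

```html
<!-- Illustrative page head: the meta tag below is the read-aloud opt-out -->
<head>
  <meta name="google" content="nopagereadaloud">
</head>
```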
Apple was known for its tough policies on ads in notifications, completely banning messages that resembled advertising. But on March 4 came the announcement that the tech giant is taking a step back from this policy. Users will still be protected, as they will have to explicitly consent in advance, for every app they use. In any case, this is a big change for all the companies with apps in the Apple App Store. No doubt, future marketing strategies will take this into account.
More about this on The Verge.
In our previous weekly review, we talked about the testing of LinkedIn stories. One week later, we are seeing similar news from Twitter. According to BuzzFeed News, the new feature being tested is called “Fleets”. It was to be expected that Twitter would follow this trend as well.
The new feature will likely have a big impact on the platform. Users preoccupied with digging up old tweets in order to discredit or publicly shame the people who wrote them may have less to work with now. For website owners who share their content on Twitter, the impact will likely be smaller.
As in the previous week, Visitor Analytics is offering extended recommendations to its users on other tools they can use for their websites. The company’s customers are website owners or webmasters, and the vast majority use Wix as their website builder platform. This article helps them find the right solutions for their business by recommending Wix apps from a variety of categories: forms, stores, chat, logo makers and booking apps.
You may or may not be familiar with DuckDuckGo, a search engine that focuses on user privacy. Although it is not yet close to challenging Google, it is capitalizing on the current trend toward internet privacy, and some stats show impressive growth in its market share and overall internet presence. In simple terms, DuckDuckGo does not track users, nor does it allow any of its partners to do so.
With the launch of Tracker Radar, DuckDuckGo is taking a risk by making public a list of the most common cross-site trackers it blocks. In the company’s own words, “this data set is now publicly available to use for research and for generating tracker block lists. And, the code behind it is now open source”.
This is similar to a chef disclosing his secret recipes: a big part of how DuckDuckGo works is now freely accessible. The people behind the company seem aware of that, but they say they care more about spreading the knowledge needed for greater internet privacy.
See the announcement here and the list of cross-tracking domains here. FYI, Visitor Analytics is not on it, as we do website analytics without cross-tracking.
Depending on what they mean by “manually”, this could have a big impact on a Google service that has already been under fire recently. Many have been complaining about GMB’s inability to deal with fake reviews, and fake listings for that matter. Recent updates to the local search algorithm have spurred controversy, and Google My Business posts were never very popular. Now, users fear that manual reviewing will slow down the process of editing their GMB profiles. Webmasters running local businesses that rely on their GMB listings are likely to be disappointed again.
We've made changes to our photo and video content policy. All photos and videos are now reviewed before publication. If you're having issues adding photos, check out our photo criteria: https://t.co/XR21n7uM9Z
— Google My Business (@GoogleMyBiz) March 5, 2020
This is no longer big news, as most active websites are already indexed based on how they function on mobile devices. Google is again urging website owners to make sure they build responsive websites that have the same content, structure and meta content as their desktop versions. If you are not doing this already, make sure to check your website for potential mobile indexing issues. If there are any crawling errors, you will be notified in Search Console. For this reason and many others, having a Google Search Console account is fundamental for any website.
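As a rough illustration of the “same content and meta content” advice, you can spot-check whether the mobile and desktop versions of a page carry the same title and description. This is only a sketch using Python’s standard-library HTML parser, with hypothetical helper names, and is in no way Google’s own checklist:

```python
# Illustrative spot-check for mobile/desktop parity (hypothetical helper,
# not an official Google tool): compares title, meta description, and
# whether the mobile page declares a viewport meta tag.
from html.parser import HTMLParser


class MetaCheck(HTMLParser):
    """Collects the <title>, meta description and viewport flag of a page."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False
        self.title = None
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "viewport":
            self.has_viewport = True
        elif tag == "meta" and a.get("name") == "description":
            self.description = a.get("content")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data


def mobile_parity_report(desktop_html, mobile_html):
    """Return a small dict of parity checks between two HTML documents."""
    d, m = MetaCheck(), MetaCheck()
    d.feed(desktop_html)
    m.feed(mobile_html)
    return {
        "mobile_has_viewport": m.has_viewport,
        "titles_match": d.title == m.title,
        "descriptions_match": d.description == m.description,
    }
```

In practice you would fetch each version with the appropriate user agent and pass the raw HTML to `mobile_parity_report`; any `False` in the report is a candidate mobile indexing issue to investigate in Search Console.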
Read more on the official Google Webmasters Blog.
Sign up to Our Newsletter for Regular Nuggets. And don’t worry, we won’t tell sales.