Google on How it Handles Sites With Too Many Ads


Google’s John Mueller weighs in on how sites with lots of ads are handled when it comes to ranking in search results.

This topic came up during the Google Search Central live stream on December 11, where Mueller addressed the following question:

“For some queries I see Google ranking websites which have a high amount of ad content, which is creating a poor search experience. How is Google dealing with those sites?”

In response, Mueller describes several factors that can determine how sites with many ads are dealt with in search results. On rare occasions they may be removed from search, but only if extreme conditions are met.

Mueller also goes over why Google will choose to keep sites in its index when they’re in obvious violation of webmaster guidelines. He had much to say on the subject, and here’s everything summarized for you.

Mueller Addresses Sites With Too Many Ads in Search Results

Mueller couldn’t speak to the handling of any specific site as no example was provided, so he spoke more broadly to the handling of sites with a poor user experience.



He points to several algorithm updates that can impact how sites with a poor user experience are ranked:

  • Page Layout Algorithm: Impacts sites with too many ads above the fold. Launched in 2012.
  • Page Speed Algorithm: Sites with an abundance of ads may be impacted if their pages are slow to load. Launched in 2018.
  • Core Web Vitals: Specifically targets sites with a suboptimal user experience. Launches in May 2021.

Mueller didn’t mention this one, but I’ll add that Google’s intrusive interstitial penalty could also impact sites with an excessive amount of ads.

“It’s hard to say without any examples, but there are various things that we do take into account with regard to the user experience side of things. So on the one hand we did, I think a couple years ago, an update where the above the fold content is something we weighed a little bit more.

So that’s something where if there is a lot of ad content above the fold perhaps it might be affected by that. There are other updates that have happened in the past with regard to speed. There’s the Core Web Vitals which is going to be launched in May, with regards to ranking in search results, which also helps there.”



Pages With Poor User Experience Can Still Rank

Mueller goes on to explain that pages with a bad user experience can still rank if they happen to be extremely relevant for particular queries.

“The other thing also to keep in mind is we use a lot of different factors to determine ranking in the search results to try to understand what is relevant for users in the individual times.

It can very well happen that a page is extremely relevant in some regards but still has a really bad user experience, and we will still show it in the search results. And sometimes we show it highly in the search results.”

Mueller gives a common example: queries that include a website’s name. Google will still rank the site in search results, even if it has bad UX, because that’s what the user is looking for.

“So just because a page has a bad user experience, and if we were to take that bad user experience into account, it doesn’t mean we would never show that page.

This is something that is really common, for example, if you search for a website’s name then you would expect to find that website even if it is doing weird things and has a really bad user experience. You would still expect to find that website.

And there’s a broad range of different kinds of queries and understanding of relevance that flow into things like that. With that said it’s always possible to find these kinds of pages in search results.”

Google Rarely Removes Sites For Having Bad UX

No matter how awful a site’s UX might be, Google will not remove the site from search results solely for that reason.

Manual removals are usually reserved for sites that are irrelevant and don’t offer anything unique.

“It’s extremely rare for us to manually go in and say we will completely remove this website from search so it never shows up for any queries.

We usually reserve that specifically for cases where the whole website is essentially irrelevant. Where the whole website is just scraping content from the rest of the web, and there’s nothing unique of value at all on the website, then that’s something where the webspam team might go in and say this is a pure spam website and there’s nothing of value here.

Then we’ll remove that [site] from search but for [every other site with bad UX] we can still show it, and in some cases the other factors come in and play a larger role.”



Mueller also shares his opinion on Google’s handling of sites with bad UX, saying he thinks it’s important that they’re kept in search results.

He describes instances where websites are egregiously difficult to navigate simply because the site owner doesn’t know any better. Oftentimes those sites belong to legitimate businesses, which is why Google isn’t too heavy-handed with the ban hammer.

“I think this is also important because a lot of people don’t know everything they should be doing on a website. They don’t know all of the details of what is important or what they should not do. They don’t know those tricks they heard from friends – are they really bad or are they just kind of bad? Do they kind of work sometimes?

They end up doing lots of weird things, and all of these websites that do things which are kind of suboptimal, as an expert you might look at it and say ‘oh they should not be doing this, this is so clearly black hat against the webmaster guidelines.’

They might not know and they might be a legitimate small business and they just have their website like that. And in cases like that I think we should definitely continue to show that website, it’s not like it’s completely irrelevant for users maybe they’re just doing things in ways that they don’t know any better.”




