Tuesday, February 24, 2026


Google’s Preferred Sources Tool Is Jammed With Spam

Introduction to Google’s Preferred Sources Tool

Google’s Preferred Sources tool lets users personalize their news feed by selecting favorite websites to appear more frequently in the Top Stories feature. Rather than relying solely on Google’s ranking system, users gain direct control over which news outlets they see, producing a more tailored experience with more content from their chosen sources.

How the Preferred Sources Tool Works

The Preferred Sources feature enables users to choose which news sources they want to see more often. This doesn’t block other sites from appearing, but rather personalizes the user’s experience to reflect their chosen sources. By giving users more control over their news feed, Google aims to improve the overall user experience.

The Issue with Similar Domains

However, an issue has arisen with the Preferred Sources tool. People are registering domains that closely resemble those of well-known websites, typically by squatting on an exact match of a domain name under a different top-level domain (TLD). For example, if a popular site is registered under a .com or .net TLD, domain squatters will register the same domain name under a .com.in or .net.in TLD.
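The squatting pattern described above can be detected with a simple heuristic: compare a candidate domain's brand label against a trusted domain's, and flag it if the labels match but the TLD suffixes differ. The sketch below is a minimal stdlib-only illustration with a small hard-coded suffix list; the function names and the suffix list are assumptions for this example, and a production filter would use the full Public Suffix List rather than a fixed tuple.

```python
# Minimal sketch of copycat-domain detection (assumption: this is an
# illustrative heuristic, not Google's actual filtering logic).
# Compound suffixes are listed before their shorter counterparts.
KNOWN_SUFFIXES = (".com.in", ".net.in", ".co.in", ".com", ".net", ".org")

def split_domain(host: str) -> tuple[str, str]:
    """Split a hostname into (brand label, TLD suffix) using a fixed suffix list."""
    host = host.lower().strip(".")
    for suffix in KNOWN_SUFFIXES:
        if host.endswith(suffix):
            # Keep only the label immediately left of the suffix.
            return host[: -len(suffix)].rsplit(".", 1)[-1], suffix
    brand, _, tld = host.rpartition(".")
    return brand.rsplit(".", 1)[-1], "." + tld

def looks_like_squat(candidate: str, trusted: str) -> bool:
    """True if candidate reuses a trusted site's brand label under a different suffix."""
    c_brand, c_suffix = split_domain(candidate)
    t_brand, t_suffix = split_domain(trusted)
    return c_brand == t_brand and c_suffix != t_suffix
```

Under this heuristic, `looks_like_squat("example.com.in", "example.com")` returns `True`, while the trusted domain itself and unrelated brand labels are not flagged.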


Examples of Copycat Sites

There are several examples of copycat sites appearing in the Preferred Sources tool. A search for a popular SEO tool surfaces the correct domain, but also a parked domain in the Indian .com.in country-code TLD. Similarly, a search for the New York Times surfaces a parked domain in the Indian .com.in TLD. These copycat sites often feature low-quality content, such as articles about payday loans, personal injury lawyers, and luxury watches.

Screenshots of Copycat Sites

Screenshots of these copycat sites show that they are often poorly designed and feature irrelevant content. For example, a screenshot of a site claiming to be the Huffington Post features articles about payday loans and personal injury lawyers. Another screenshot shows a site search for a copycat domain, which reveals that only the home page is indexed by Google.

What’s Going On?

It’s unclear how these copycat domains are getting into the Preferred Sources tool. It’s possible that SEOs are registering these domains and submitting them to the tool, or that Google is automatically picking them up. Either way, it’s clear that the tool is surfacing low-quality sites that are not relevant to users’ interests.

Conclusion

The Preferred Sources tool is a great idea in theory, but it clearly needs work. By surfacing copycat sites and parked domains, Google is not giving users the best possible experience. To improve the tool, Google needs to filter out these low-quality sites and ensure that only legitimate websites appear, which would deliver the more personalized and relevant news feed the feature promises.
