Tuesday, February 24, 2026


Google Explains Why They Need To Control Ranking Signals

Introduction to Google Ranking Factors

Google’s Gary Illyes recently answered a question about why Google doesn’t use social sharing as a ranking factor. This answer provides insight into how Google approaches external signals and why it’s unlikely to use them in the future.

The Interview with Gary Illyes

Kenichi Suzuki, a respected Japanese search marketing expert, published an interview with Gary Illyes. In it, Rio Ichikawa of Faber Company asked Illyes whether social media views and shares are used as ranking signals in Google's algorithm.

Are Social Media Shares or Views Google Ranking Factors?

Gary Illyes’ response was straightforward: social media views and shares are not ranking signals, and it’s unlikely they will be in the future. The reasoning behind this decision is rooted in Google’s need to control its own signals. External signals, such as those from social networks, are not reliable because they can be easily manipulated.

Why External Signals Are Unreliable

Illyes explained that if Google were to rely on external signals, it would have no way to verify whether those signals are legitimate or artificially inflated. That lack of control makes them untrustworthy. The perspective is not new: Google has expressed similar concerns about other signals that can be easily manipulated.

Easily Gamed Signals Are Unreliable for SEO

The SEO community often debates the role of various signals in Google’s algorithm. However, signals that can be easily manipulated are unlikely to be used. For example, structured data can be used to make websites eligible for rich results, but it’s not a ranking factor. Additionally, misusing structured data can lead to manual actions.
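Structured data of the kind described above is typically published as JSON-LD embedded in the page's HTML. A minimal sketch in Python (the headline and date values are illustrative assumptions, not drawn from Google's guidance):

```python
import json

# Minimal schema.org Article markup, serialized as JSON-LD.
# All field values here are illustrative placeholders.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google Explains Why They Need To Control Ranking Signals",
    "datePublished": "2026-02-24",
}

# Embed as a <script type="application/ld+json"> block in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(snippet)
```

Markup like this can make a page eligible for rich results, but, consistent with the point above, eligibility is distinct from any ranking benefit, and markup that misrepresents the page can trigger manual actions.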

Examples of Unreliable Signals

Other examples include the llms.txt protocol proposal, which Google's John Mueller deemed unreliable, comparing it to the keywords meta tag that SEOs frequently misused. The author byline is another signal that was promoted as a way to demonstrate authority but proved easy to abuse.

Key Takeaway from Gary Illyes’ Answer

The key statement in Illyes’ answer is that Google needs to be able to control its own signals. This statement highlights the importance of focusing on signals that are within Google’s control and ignoring those that can be easily manipulated.

Conclusion

Google’s decision not to use social media views and shares as ranking factors is rooted in its need to control its own signals. The SEO community should take note of this perspective and focus on creating experiences that users love rather than trying to manipulate external signals. Strategies built that way align with Google’s goals.
