TLDR
Amid the rise of AI-generated summaries on search engines, a new challenge for brands is “Neural Howlround,” a term coined in a 2025 study by Ohio State University. It describes a bias in large language models (LLMs) where specific websites are favored in AI summaries, even if other sites have more relevant information. This phenomenon is critical because AI summaries now capture the highest share of search impressions and have a CTR of 38.9%, nearly matching the top organic search result. To combat this, marketers must craft content that caters to their ideal customer profile, utilize structured data and FAQs to increase the likelihood of AI retrieval, and follow core SEO rules like having experts cite their content.
A recent study by First Sage Page highlights the importance of AI-generated summaries on Google SERPs for marketers. The research shows that AI summaries now capture the highest share of search impressions for a given keyword, and their average CTR stands at 38.9%, close to that of the Rank 1 organic result. More clicks not only mean more traffic to your brand’s website; they are also crucial for establishing legitimate authority and a favorable bias in the customer’s mind.
Getting website content referenced by Google Gemini when it generates AI summaries for keyword searches by users who fit the brand’s ideal customer profile has now become a KRA for every SEO team. The challenge is that a website can fall into the trap of Neural Howlround.
What is Neural Howlround?
A 2025 study by Ohio State University initially coined the term. Their analysis shows that AI systems driven by large language models (LLMs) may exhibit an inference failure mode: a self-reinforcing cognitive loop in which specific, highly weighted inputs become dominant, leading to entrenched response patterns that resist correction.
The phenomenon highlights an interesting bias in LLMs: a notable preference for specific website URLs when generating AI summaries. Even if your website’s URL offers more relevant, fact-checked information than competitors’ sites, the model in question may still favour those competing URLs.
If your website falls into this trap, it suffers from a low selection rate: the percentage of times your URL is chosen from the available options on the internet when search engines generate AI summaries for a specific user query. Worse, a competitor can exploit this bias to create a false narrative about your brand.
Neural Howlround in traditional search
As search marketers, this phenomenon may seem unique to the age of AI, but a trip down memory lane shows that traditional search exhibited similar behaviour.
We all know that search engines like Google reward freshness, meaning sites crawled and updated more frequently often gain higher visibility for time-sensitive queries. You have seen a similar pattern with country-code top-level domains such as .in and .jp, which are given preference in localized searches. This is a structural bias of search engine algorithms.
Popularity and brand bias were also built into older Google ranking algorithms. Established brands received favourable treatment in rankings even when their content wasn’t necessarily stronger, making it harder for smaller brands to achieve breakthroughs in organic rankings.
What is the way forward for SEO marketers?
Although corporations may continue to deny the biases in their systems, marketers still need to find a way to improve.
Target with clarity
Clarity means making deliberate choices about inclusion and exclusion. Form a clear idea of your brand’s ideal customer profile, craft your website content specifically for that niche, and follow the rules of the game: use structured data, include a summary for every piece of content, and, where possible, add FAQs to your content structure to increase the chances of AI retrieval.
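As a minimal sketch of the structured-data point, FAQ content can be exposed to crawlers as a schema.org FAQPage JSON-LD block. The helper and the sample question below are hypothetical, for illustration only:

```python
import json

def faq_jsonld(faqs):
    """Build a schema.org FAQPage JSON-LD block from (question, answer) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in faqs
        ],
    }, indent=2)

# Hypothetical FAQ entry for illustration.
snippet = faq_jsonld([
    ("What is Neural Howlround?",
     "A self-reinforcing bias in LLM-driven AI systems where highly "
     "weighted inputs dominate the generated summaries."),
])

# Embed the block in the page <head> so crawlers can parse it.
print(f'<script type="application/ld+json">\n{snippet}\n</script>')
```

The same pattern works for other schema.org types (Article, HowTo), which gives AI retrieval systems machine-readable context about the page.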
Be present in every stage of the consumer buying decision.
You can’t direct your content approach only at the bottom of the consumer buying-decision funnel, or focus only on time-bound trends (remember, you are not a news website). Consider a layered website content strategy that satisfies every search intent. Marketers can design guides, tools, and content series on consumer education to build credibility, which may increase their likelihood of escaping AI biases.
Follow the thumb rules of GEO.
The core thumb rules of GEO (generative engine optimization) remain essential in this era: earn citations from experts, participate in user-generated conversations, prefer structured data, keep your content fresh and updated, and allow AI indexing through your robots.txt file.
We don’t know the actual near-term risk, but we can expect LLMs to improve and your website to earn a viable spot in AI summaries on SERPs. Stay prepared either way: your brand needs to be present in enough places for AI models to treat it as consensus.