Latest SEO News October 2025: Google, OpenAI & Reddit Updates
AI Search, Visual Results & Major Legal Moves
🧠 Google AI Mode Evolves with Visual Intelligence
Google began October by announcing a major update to AI Mode, which now interprets queries and delivers responses in a far more visual way. The system can process not just text but also images, enabling it to produce combined text-and-image results designed to “spark inspiration” and create a smoother, more conversational experience for users.
As part of this update, Google has also introduced a visual fan-out technique. Previously, AI Mode broke user queries down into multiple text-based subtopics; now it performs the same analysis visually, considering metadata, context, and individual image regions. The result is a dynamic grid of visual responses tailored to each query.
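Google hasn't published implementation details for visual fan-out, but the core pattern, splitting one query into several sub-queries and running them concurrently before merging the results into a grid, is straightforward to sketch. The Python snippet below is purely illustrative: `generate_subqueries` and `image_search` are invented placeholders standing in for Google's model and retrieval steps, not real APIs.

```python
import asyncio

async def generate_subqueries(query: str) -> list[str]:
    # Hypothetical stand-in for the model step that decomposes a query into subtopics.
    return [f"{query} ideas", f"{query} styles", f"{query} examples"]

async def image_search(subquery: str) -> list[dict]:
    # Hypothetical stand-in for an image search over metadata, context,
    # and image regions; it fabricates a single placeholder result.
    await asyncio.sleep(0.1)  # simulate network latency
    return [{"subtopic": subquery, "image": f"https://example.com/{abs(hash(subquery))}.jpg"}]

async def visual_fan_out(query: str) -> list[list[dict]]:
    # Fan one query out into parallel sub-searches and collect the rows
    # of the visual "grid": one row of results per subtopic.
    subqueries = await generate_subqueries(query)
    return await asyncio.gather(*(image_search(q) for q in subqueries))

if __name__ == "__main__":
    for row in asyncio.run(visual_fan_out("mid-century living room")):
        print(row)
```

The point of the pattern is concurrency: `asyncio.gather` runs every subtopic search at once, which is what lets a fan-out feel like a single fast answer rather than a chain of sequential lookups.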
🌍 AI Mode Expands to More Languages and Regions
Source: Hema Budaraju, Vice President, Product Management, Search – Google
On October 7, Google announced that AI Mode in Search is rolling out to over 35 new languages and 40 new countries and territories, bringing total availability to more than 200 countries and territories, including many across Europe. The rollout will continue throughout the week.
This expansion is powered by Google’s advanced Gemini models, which enhance multimodal comprehension and natural language understanding. The technology enables users to ask complex questions conversationally in their own language. According to Google, AI Mode users are already submitting queries nearly three times longer than those typed into standard searches.
👟 Google’s ‘Try-On’ Feature Adds Shoes and Expands Globally
Source: Google
Google’s AI-powered virtual try-on tool – which already lets shoppers preview clothes on themselves – is stepping into new territory. As of October 8, the feature has expanded to include shoes, with users in the United States getting first access. The service will also roll out soon in Australia, Canada, and Japan.
This system uses AI to accurately interpret body shape and depth, creating a realistic visualisation of how shoes or clothes would appear. Users can try it by selecting the “try it on” option under a product listing, uploading a full-length image, and instantly seeing how the footwear looks on them. Google reports that U.S. users share try-on results far more often than standard product images.
🧭 OpenAI Launches ChatGPT Atlas Browser
October 21 marked a new milestone for OpenAI with the launch of its integrated web browser, ChatGPT Atlas. The browser, which includes ChatGPT and Agent Mode built directly into the experience, is available now as a free download.
Described by OpenAI as “a browser built with ChatGPT,” Atlas aims to evolve ChatGPT into a super-assistant that understands context, supports everyday browsing tasks, and helps users achieve their goals. The move strengthens OpenAI’s presence on the web and signals its ambition to build toward a complete, operating-system-like environment.
⚖️ Reddit Takes Legal Action Against Perplexity and SerpApi
In a high-profile move on October 22, Reddit filed a lawsuit against Perplexity, SerpApi, Oxylabs, and AWMProxy in the U.S. District Court for the Southern District of New York. The platform accuses the companies of illegally scraping Reddit data from Google Search results and reusing it to train AI models or resell it commercially.
The complaint alleges that these companies used hidden identities to bypass technical protections and conducted data scraping “at an industrial scale.” Reddit is seeking financial compensation, a permanent injunction, and a ban on the sale or use of data previously obtained this way.
While Reddit already licenses its data to Google and OpenAI, it claims the defendants bypassed that licensing system by scraping instead. To prove this, Reddit reportedly created a hidden test post visible only to Google’s crawler; the post later appeared in Perplexity’s search results, allegedly confirming that the company had scraped Google-sourced Reddit data.
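Reddit hasn’t disclosed how the trap post was built, but the underlying technique, serving a page only to verified Googlebot requests, is well known and easy to sketch. The hypothetical Python helper below (our own illustration, not Reddit’s code) follows Google’s documented crawler-verification steps: a User-Agent check, then reverse DNS, then a forward lookup to confirm the hostname maps back to the same IP.

```python
import socket

def is_verified_googlebot(user_agent: str, client_ip: str) -> bool:
    # Cheap first pass: the request must at least claim to be Googlebot.
    if "Googlebot" not in user_agent:
        return False
    try:
        # Reverse DNS: genuine Googlebot IPs resolve to googlebot.com
        # or google.com hostnames (Google's documented verification method).
        host, _, _ = socket.gethostbyaddr(client_ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # so a scraper can't simply spoof its reverse DNS record.
        return client_ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        return False
```

In a request handler, the hidden post would be served only when this check passes; if its text later surfaces in a third-party product, the content can only have travelled through Google’s index.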
💬 What the Experts Are Saying
- Barry Schwartz – “Google’s move towards more visual AI responses is a clear sign that search is becoming less about words and more about understanding human context.”
- Aleyda Solís – “The expansion of AI Mode to so many languages highlights Google’s intent to make conversational, multimodal search universal.”
- Lily Ray – “The Reddit lawsuit could reshape how data licensing works across AI and SEO platforms, especially for user-generated content.”
“AI is transforming search faster than most realise – and it’s not just about how we ask questions anymore, but how we experience the answers.”
— David Roche, The SEO Guide Book
❓ Frequently Asked Questions
What changed in Google AI Mode on 1 October 2025?
Google introduced a more visual approach to AI Mode. It can now understand queries visually as well as textually and respond with a mix of text and visual results designed to feel like a fluid, ongoing conversation that “sparks inspiration”.
What is Google’s visual fan-out technique?
It’s an evolution of query fan-out where AI Mode breaks a question into subtopics and issues multiple queries at once, now also analysing images (including regions, metadata, and surrounding context) to produce a visual grid of responses.
Does AI Mode now return visual answers as well as text?
Yes. AI Mode can interpret your request visually and reply with both textual and visual outputs, not just text-based results.
When and where did Google expand AI Mode language and location support?
On 7 October 2025 Google began rolling out AI Mode to 35+ new languages and 40+ additional countries and territories, bringing availability to over 200 countries and territories worldwide, including many across Europe. The rollout continues over the week.
Which models power the expanded AI Mode experience?
Google’s latest Gemini models power the experience, providing advanced natural language understanding and multimodal capabilities so people can ask anything in their preferred language.
How long are queries in AI Mode compared with traditional search?
Google reports that users ask questions nearly three times longer in AI Mode than in traditional searches.
What’s new in Google’s try-on feature and where is it available?
From 8 October 2025, try-on adds shoes. Shoppers in the United States can try it now, with expansion to Australia, Canada, and Japan in the coming weeks.
How does Google’s try-on work for shoes?
Select “try it on” on a product listing, upload a full-length photo, and within moments you’ll see how the footwear, from heels to trainers, looks on you. The AI interprets body shape and depth so the visualisation preserves natural proportions.
What is ChatGPT Atlas and how do I get it?
ChatGPT Atlas is OpenAI’s web browser with ChatGPT (including Agent Mode) built in. It’s available now as a free download.
What is Reddit’s lawsuit about and who is named?
In a complaint filed on 22 October 2025 in the U.S. District Court for the Southern District of New York, Reddit alleges that Perplexity, SerpApi, Oxylabs, and AWMProxy illegally scraped Reddit content indirectly via Google Search. Reddit seeks financial damages, a permanent injunction, and a ban on using or selling previously scraped data, citing a hidden test post that later appeared in Perplexity’s results. Reddit already licenses its data to Google and OpenAI.