Google Pixel 10 should be called "AI phone"
Plus: Learn about the 10 AI agents that will do everything for you (don't miss out on this)
The smartphone war just turned into an AI arms race.

Google just unveiled its Pixel 10 lineup, and it isn’t shy about why this matters: AI is now the centerpiece of its pitch, not just another feature.
Here’s everything you need to know:
Google’s Pixel 10 phones now center on Gemini, its AI assistant, doing everything from scheduling to real-time scene analysis.
A new feature called “Magic Cue” anticipates your needs before you ask, like surfacing flight info mid-call.
The $799 base model climbs to $1,799 for the foldable Pro Fold, which touts an 8-inch display and heavy-duty hinge.
“Camera Coach” can suggest angles, lighting, and even merge photos into one where everyone looks their best.
Pixel users get a year of Google’s “AI Pro” subscription (normally $19/month), with access to advanced tools and storage.
Google isn’t outselling Apple or Samsung yet, but it’s making a clear bet: AI is the next wedge to shift the market.
With Apple’s Siri overhaul delayed until 2026, Google has a rare opening to lead with capability, not just marketing.
Most people don’t switch phones over one shiny feature, but if AI genuinely makes daily tasks easier, it could chip away at brand loyalty. This isn’t just a tech launch; it’s Google betting that usefulness will outcompete habit.
Surya: the AI that reads the Sun’s next move

Image Credits: NASA
NASA and IBM just launched Surya, a foundation AI model trained to predict solar flares. It’s built on nine years of Sun data, and it’s already beating benchmarks.
Here’s everything you need to know:
Surya was trained on high-resolution data from NASA’s Solar Dynamics Observatory, covering nearly a full solar cycle.
It can predict solar flares up to two hours ahead, a leap in space weather forecasting.
Surya beat existing models by 16% in early tests, without requiring extensive labeling.
The model works by learning patterns from raw ultraviolet, magnetic, and velocity imagery of the Sun.
It’s open-source: scientists can access the model on HuggingFace and code on GitHub.
Use cases include protecting satellites, GPS systems, astronauts, and even power grids from solar disruptions.
NASA says the same architecture could power AI models in planetary science, Earth observation, and more.
This is a rare combo: groundbreaking science, critical real-world impact, and open-access tooling. Surya shows that foundation models aren’t just for text or chatbots. They might soon be the backbone of scientific discovery itself.
AI is finally learning to think in 3D

Image Credits: Microsoft
Microsoft Research just introduced MindJourney, a new way for AI to “mentally explore” 3D spaces. The breakthrough tackles a major weakness in vision-language models: understanding how objects relate in real-world space.
Here’s everything you need to know:
VLMs (like Gemini or GPT-4o) are great with static images but stumble with spatial questions like “what’s behind you if you turn left?”
MindJourney simulates 3D movement using a video-trained world model, letting AI imagine what different perspectives might look like.
Instead of brute-forcing every option, it uses a “spatial beam search” to focus only on the most promising paths.
The system improved VLM performance by 8% on spatial reasoning benchmarks with zero extra training.
It works by layering symbolic reasoning from VLMs with world models that understand motion and perspective.
This means smarter agents for robotics, AR/VR, smart homes, and possibly tools for people with visual impairments.
Microsoft plans to expand the system to predict future changes in scenes, bringing planning and vision together in one loop.
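To make the “spatial beam search” idea above concrete, here is a toy sketch. This is not Microsoft’s implementation: the grid world, the `score` function, and the `GOAL` viewpoint are all invented for illustration. In MindJourney, the imagined views come from a video-trained world model and the scoring from the VLM; here both are stubbed with simple placeholders so the pruning logic itself is visible.

```python
# Toy sketch of beam search over imagined viewpoints (illustrative only).
# States are (x, y) positions in a grid; expanding a state stands in for the
# world model imagining the views one step away, and score() stands in for
# the VLM judging how promising an imagined viewpoint looks.

GOAL = (3, 2)  # hypothetical viewpoint that best answers the spatial query

def score(state):
    """Stub for the VLM's relevance estimate: closer to GOAL scores higher."""
    x, y = state
    return -(abs(x - GOAL[0]) + abs(y - GOAL[1]))

def expand(state):
    """Stub for the world model: imagined views one step in each direction."""
    x, y = state
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def spatial_beam_search(start, beam_width=2, depth=5):
    beam = [start]
    for _ in range(depth):
        candidates = {s for state in beam for s in expand(state)}
        # Keep only the most promising imagined viewpoints, instead of
        # brute-forcing every possible path.
        beam = sorted(candidates, key=score, reverse=True)[:beam_width]
        if GOAL in beam:
            break
    return beam

print(spatial_beam_search((0, 0)))
```

The key design point mirrored here is the pruning step: at each depth, only `beam_width` candidates survive, so the search cost stays linear in depth rather than exploding with every possible movement sequence.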
This feels like a foundational shift. Vision alone isn’t enough; spatial understanding is the missing link for real-world AI. MindJourney hints at a future where AI agents don’t just “see,” they move, test, and plan inside a mental map. That’s how humans think, and it’s about time AI caught up.