
Looking Back at 2024 and What’s Next for Nanome in 2025

December 17, 2024

Reflecting on Another Year at Nanome

As we close out another year at Nanome, we’ve been reflecting on just how far we’ve come and what an incredible journey it’s been. From new plugins and patches to the emergence of Apple’s Vision Pro and the launch of Nanome AI, Nanome 2.0, and MARA, this year brought both milestones and glimpses into the future of scientific discovery.

Starting Strong with New Plugins

We kicked off the year by expanding our platform’s versatility through new plugins like the Scene Viewer and the Jupyter Cookbook. The Scene Viewer simplified how our users curate and present molecular narratives, while the Jupyter Cookbook Plugin blended Python’s adaptability with the immersive power of XR. Scientists could now open multiple Python notebooks right inside Nanome, share algorithms with colleagues, and integrate the computational tools they care about—all within virtual and mixed reality environments.

A Sneak Peek into the Future: Apple Vision Pro

We were thrilled about Apple unveiling the Vision Pro. From day one, we saw how its eye, hand, and voice-based interfaces could transform immersion and usability. We built NanoSpin, a playful prototype that showcased molecules spinning in bounded volumes, and even ported our classic Calcflow to Vision Pro, reimagining 3D math visualization for this new era. The Vision Pro demonstrated that true spatial computing is finally here, and we’re proud to have taken those first steps on the platform.

Refining Collaboration and User Experiences

Mid-year, we rolled out patch 1.24.6, focusing on one of the most essential aspects of scientific work: collaboration. This update offered simpler room setups, better participant controls, and more seamless meeting flows, making it easier for teams around the world to walk into a virtual space and get right to work. Throughout these improvements, we learned that small changes to workflows can have a huge impact on productivity and creativity.

MARA: Our Scientific Copilot Emerges

The public beta launch of MARA (Molecular Analysis and Reasoning Assistant) was another exciting milestone. Serving as an AI-driven scientific coworker, MARA made it easier to analyze molecules, run specialized tools, and create custom workflows—all with just a few prompts. This wasn’t about adding just another feature; it was about rethinking what’s possible when AI and XR unite to streamline scientific innovation.

A Leap Forward: Nanome AI and Nanome 2.0

Toward the end of the year, we introduced Nanome AI Early Access and Nanome 2.0—a true culmination of nearly a decade of vision. Nanome 2.0 gave us a reimagined platform built from the ground up for intuitive collaboration, robust data handling, and richer visuals. Paired with the AI-driven magic of MARA, we’re now closer than ever to delivering an interface that feels like a real-life “JARVIS” for scientists, helping them tackle complex challenges and spark fresh breakthroughs. It’s the strongest foundation we’ve ever built and the perfect launchpad for whatever comes next.

What to Expect from Nanome in 2025

Feature-Fueled Expansion in Nanome 2.0 & MARA

In the coming year, we’ll be rolling out a wide range of new features designed to unlock fresh scientific workflows. Nanome 2.0 is as intuitive as it is powerful, and we can’t wait to support even more ways for you to explore, innovate, and accelerate your discoveries.

Deeper Integration with MARA

By blending MARA’s extensive toolkit directly into Nanome 2.0, we’ll create one cohesive platform where raw data seamlessly transforms into actionable insights. This integrated ecosystem will empower users to tackle complex challenges without missing a beat.

Industry Trends to Watch in 2025

Rising Reasoning Models and Tool Use

We’re closely following how reasoning models like OpenAI’s o1 evolve to leverage tools. Once these assistants seamlessly integrate real-world utilities, the scope of AI-powered analysis will broaden, making research more adaptive and context-aware.

Competitive Reasoning Platforms from Tech Giants

The next wave of reasoning models from Meta, Google, and Anthropic promises a surge in innovation. As they vie for prominence, expect cutting-edge features and integrations that push AI-driven scientific workflows to exciting new heights.

More Accessible Apple Vision Headsets

Rumors of a more affordable Apple Vision device are sparking excitement, as it could be the catalyst that brings spatial computing to the mainstream. Greater accessibility means more researchers can leverage immersive XR to explore and unravel molecular complexities like never before.

What’s particularly intriguing, for our team at least, is how Apple will cut costs. Will they ditch EyeSight, the controversial front-facing display that shows the user’s eyes? Or perhaps reduce the onboard processing power and shift computation to the iPhone?

Whatever approach they take, one thing is clear: with another trillion-dollar tech giant pushing the boundaries in XR, the market is only heating up—and that’s great news for Nanome!

Meta’s Orion AR Glasses in Production & Speculations on Meta Quest Pro 2

As Meta moves its Orion AR glasses toward commercial release, lightweight and highly portable AR will find its way into everyday lab environments. Imagine having critical data and simulations at your fingertips, wherever you’re working. If Meta rolls out a Quest Pro 2 focused on productivity and mixed reality, it will raise the bar for immersive research platforms. Such an advancement could further validate and refine XR’s role as an essential component of scientific innovation.

Looking Back, Looking Ahead

This year, we moved from concept to reality on multiple fronts. We turned notebooks into living tools, tested the limits of spatial computing, refined how teams interact in XR, and brought AI onto the molecular stage. There’s a certain nostalgia in revisiting how each step—small or bold—brought us closer to our mission: to empower scientists with the most intuitive, immersive, and powerful platform imaginable.

As we stand at this year’s end, we do so with gratitude for our community of users, partners, and dreamers who helped shape these achievements. The journey continues, and if this year taught us anything, it’s that the future of science will be more accessible, more interactive, and more collaborative than ever before. We can’t wait to take those next steps forward with all of you.