Writer drops mind-blowing AI update: RAG on steroids, 10M word capacity, and AI ‘thought process’ revealed

Writer, a leading enterprise AI platform, announced a sweeping set of enhancements to its artificial intelligence chat applications today at VB Transform. The improvements, which include advanced graph-based retrieval-augmented generation (RAG) and new tools for AI transparency, will go live across Writer’s ecosystem starting tomorrow.

Both users of Writer’s off-the-shelf “Ask Writer” application and developers leveraging the AI Studio platform to build custom solutions will have immediate access to these new features. This broad rollout marks a significant leap forward in making sophisticated AI technology more accessible and effective for businesses of all sizes.

At the heart of the upgrade is a dramatic expansion in data processing capabilities. The revamped chat apps can now digest and analyze up to 10 million words of company-specific information, enabling organizations to harness their proprietary data at an unprecedented scale when interacting with AI systems.

Unleashing the power of 10 million words: How Writer’s RAG technology is transforming enterprise data analysis

“We know that enterprises need to analyze very long files, work with long research papers, or documentation. It’s a huge use case for them,” said Deanna Dong, product marketing lead at Writer, in an interview with VentureBeat. “We use RAG to actually do knowledge retrieval. Instead of giving the LLM [large language model] the whole library, we’re actually going to go do some research, pull all the right notes, and just give the LLM the right resource notes.”
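
Writer has not published the internals of this pipeline, but the “right notes, not the whole library” idea can be pictured as a context-budgeting step: score snippets for relevance and pack only as many as fit the model’s window. The relevance scores and token budget in this sketch are illustrative assumptions, not Writer’s implementation.

```python
# Illustrative sketch of "give the LLM the right notes, not the whole library".
# The scores and token budget are made-up placeholders, not Writer's internals.

def pack_context(scored_chunks, token_budget=4000):
    """Take (relevance, text) pairs and keep the most relevant ones that fit the budget."""
    context, used = [], 0
    for score, text in sorted(scored_chunks, reverse=True):
        tokens = len(text.split())          # crude stand-in for a real token count
        if used + tokens > token_budget:
            continue
        context.append(text)
        used += tokens
    return context

# A 10-million-word corpus can't go into a single prompt, so only the
# top-ranked snippets are handed to the model alongside the question.
scored = [
    (0.92, "Q3 revenue grew 18% year over year, driven by enterprise deals."),
    (0.35, "The office relocation is scheduled for next spring."),
    (0.88, "Enterprise churn fell to 2% after the onboarding redesign."),
]
notes = pack_context(scored, token_budget=20)
prompt = ("Answer using only these notes:\n" + "\n".join(notes)
          + "\n\nQuestion: How did the enterprise segment perform?")
print(prompt)
```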


A key innovation is Writer’s graph-based approach to RAG, which maps semantic relationships between data points rather than relying on simpler vector retrieval. According to Dong, this allows for more intelligent and targeted information retrieval:

“We break down data into smaller data points, and we actually map the semantic relationship between these data points,” she said. “So a snippet about security is linked to this tidbit about the architecture, and it’s actually a more relational way that we map the data.”
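
Writer has not released code for its Knowledge Graph, but the idea Dong describes can be sketched as retrieval over a graph of linked chunks rather than a flat vector index: find the best-matching snippet, then follow its mapped relationships to pull in related snippets. Everything below, from the toy bag-of-words “embedding” to the edge list, is an illustrative assumption rather than Writer’s implementation.

```python
# Minimal sketch of graph-based retrieval vs. plain vector retrieval.
# The chunks, embeddings, and edges are illustrative only.
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' standing in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Documents broken down into smaller data points (chunks).
chunks = {
    "sec-1":  "All customer data is encrypted at rest and in transit.",
    "arch-1": "The platform architecture isolates each tenant's data store.",
    "hr-1":   "Employees receive annual security awareness training.",
}

# Semantic relationships mapped between data points -- the "graph" part:
# the security snippet is linked to the architecture snippet.
edges = {
    "sec-1":  ["arch-1"],
    "arch-1": ["sec-1"],
    "hr-1":   [],
}

def vector_retrieve(query, k=1):
    """Plain top-k similarity retrieval over a flat index."""
    ranked = sorted(chunks, key=lambda c: cosine(embed(query), embed(chunks[c])), reverse=True)
    return ranked[:k]

def graph_retrieve(query, k=1, hops=1):
    """Top-k retrieval, then expansion along semantic edges to pull related context."""
    selected = list(vector_retrieve(query, k))
    for _ in range(hops):
        for node in list(selected):
            for neighbor in edges.get(node, []):
                if neighbor not in selected:
                    selected.append(neighbor)
    return selected

query = "How is customer data secured?"
print("vector:", vector_retrieve(query))   # finds the security snippet
print("graph: ", graph_retrieve(query))    # also pulls the linked architecture snippet
```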

Peering into the AI’s mind: Writer’s ‘thought process’ feature brings unprecedented transparency to AI decision-making

This graph-based RAG system underpins a new “thought process” feature that opens a window into how the AI arrives at its responses. The system shows users the steps the AI takes, including how it breaks down queries into sub-questions and which specific data sources it references.

“We’re showing you the steps it’s taking,” Dong explained. “We’re taking kind of like a maybe potentially a broad question or not super specific question which folks are asking, we’re actually breaking it down into the sub-questions that the AI is assuming you’re asking.”
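
The exact format of Writer’s “thought process” view isn’t public, but the behavior Dong describes, breaking a broad question into assumed sub-questions and tying each one to the sources consulted, can be sketched as a simple trace object that a chat interface renders alongside the answer. The class names and the example trace below are hypothetical.

```python
# Hypothetical sketch of surfacing an AI "thought process" to the user;
# the data structures and example content are not Writer's implementation.
from dataclasses import dataclass, field

@dataclass
class Step:
    sub_question: str
    sources: list = field(default_factory=list)
    answer: str = ""

@dataclass
class ThoughtProcess:
    original_query: str
    steps: list = field(default_factory=list)

    def render(self):
        """Format the trace the way a chat UI might display it next to the answer."""
        lines = [f"Query: {self.original_query}"]
        for i, step in enumerate(self.steps, 1):
            lines.append(f"  Step {i}: {step.sub_question}")
            lines.append(f"    sources: {', '.join(step.sources) or 'none'}")
            lines.append(f"    finding: {step.answer}")
        return "\n".join(lines)

# A broad question gets broken into the sub-questions the system assumes
# the user is asking, each tied to the data sources it consulted.
trace = ThoughtProcess(
    original_query="Is our platform ready for a SOC 2 audit?",
    steps=[
        Step("What security controls are documented?",
             sources=["security-policy.pdf"],
             answer="Encryption and access controls are documented."),
        Step("Which controls map to SOC 2 criteria?",
             sources=["compliance-matrix.xlsx"],
             answer="Four of five trust criteria are covered."),
    ],
)
print(trace.render())
```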

May Habib, CEO of Writer, emphasized the significance of these advancements in a recent interview with VentureBeat. “RAG is not easy,” she said. “If you speak to CIOs, VPs of AI, like anybody who’s tried to build it themselves and cares about accuracy, it is not easy. In terms of benchmarking, a recent benchmark of eight different RAG approaches, including Writer Knowledge Graph, we came in first with accuracy.”

Tailored AI experiences: Writer’s new “Modes” streamline enterprise AI adoption

The upgrades also introduce dedicated “modes” — specialized interfaces for different types of tasks like general knowledge queries, document analysis and working with knowledge graphs. This aims to simplify the user experience and improve output quality by providing more tailored prompts and workflows.

“We observe customers struggling to use a [one-size]-fits-all chat interface to complete every task,” Dong explained. “They might not prompt accurately, and they don’t get the right results, they forget to say, ‘Hey, I’m actually looking at this file,’ or ‘Actually need to use our internal data for this answer.’ And so they were getting confused.”
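
One way to picture these modes is as task-specific presets that fill in the prompting details users tend to forget, as in the sketch below. The mode names echo the article, but the prompt templates and routing flags are hypothetical.

```python
# Hypothetical sketch of task-specific "modes" as presets; not Writer's implementation.

MODES = {
    "general": {
        "system_prompt": "Answer from general knowledge; say so if you are unsure.",
        "use_internal_data": False,
    },
    "document": {
        "system_prompt": "Answer strictly from the attached file(s) and cite passages.",
        "use_internal_data": False,
    },
    "knowledge_graph": {
        "system_prompt": "Answer using the connected company knowledge graph and cite sources.",
        "use_internal_data": True,
    },
}

def build_request(mode, question, attachments=None):
    """Assemble a chat request with mode defaults so users don't have to prompt for them."""
    preset = MODES[mode]
    return {
        "system": preset["system_prompt"],
        "question": question,
        "attachments": attachments or [],
        "retrieve_from_internal_data": preset["use_internal_data"],
    }

# The user no longer has to remember to say "I'm actually looking at this file":
# picking the document mode bakes that instruction in.
print(build_request("document", "Summarize the termination clause.", attachments=["contract.pdf"]))
```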

If the capabilities hold up in real deployments, the combination of massive data ingestion, sophisticated RAG, and explainable AI addresses several key hurdles that have made many businesses hesitant to widely deploy LLM-based tools.

The new features will be automatically available in Writer’s pre-built “Ask Writer” chat application, as well as in any custom chat apps built on the Writer platform. This broad availability could accelerate AI integration across various enterprise functions.

“All of these features – the modes, thought process, you know, the ability to have built-in RAG – are going to make this entire package of quite sophisticated tech very usable for the end user,” Dong said. “The CIO will be kind of wowed by the built-in RAG, but the end user – you know, an operations team, an HR team – they don’t have to understand any of this. What they’re really going to get is accuracy, transparency, usability.”

As enterprises grapple with how to responsibly and effectively leverage AI, Writer’s latest innovations offer a compelling vision of more transparent, accurate, and user-friendly LLM applications. The coming months will reveal whether this approach can indeed bridge the gap between AI’s immense potential and the practical realities of enterprise deployment.
