
MarkLogic for
Generative AI

Securely leverage all your enterprise data to build scalable and trustworthy generative AI (GenAI) applications with an agile architectural pattern.

Webinar

Evolving Your Data Architecture for Trustworthy Generative AI

Watch Webinar

Amplify GenAI with Your Data. Empower Your Business with Knowledge.

Elevate skills and productivity across the organization with AI-enhanced applications. Progress® MarkLogic® allows you to provide large language models (LLMs) with your domain-specific knowledge to democratize access to information across your organization.

Accurate
Responses

Augment large language models with your enterprise information and your rules to improve the validity of generative AI responses.

Trusted
Data

Create a rich, trustworthy content source for any generative AI model to surface insights with repeatable confidence.

Elevated Experiences

Enhance search with securely personalized recommendations and human-digestible insights for better user experiences.

Intelligent Applications

Build fact-based applications that can interpret intent and enable smarter operations to multiply your workforce productivity.

Retrieval Augmented Generation (RAG) with Knowledge Graphs

The effectiveness of generative AI depends on the data it uses to generate results. Graph Retrieval Augmented Generation (graph RAG) allows you to augment generative AI prompts by securely incorporating enterprise data, guiding the model to generate contextually relevant responses and reducing hallucinations and data biases for more accurate AI outputs. Connecting LLMs and knowledge graphs empowers generative AI models with access to private data and a deep understanding of that data to retrieve fact-based information about your enterprise.

[Diagram: Enterprise private documents are ingested into a multi-model database that stores and queries the content; a semantic platform captures knowledge as a graph; user queries flow through the semantic platform to generative AI.]

1. Ingest content as-is into a multi-model database, curate and harmonize it into the relevant model required for both the LLM and other downstream applications, making it extensible for future applications and services.
2. Semantically tag your private content using classification based on ontologies, taxonomies, entity and fact extraction, all within a semantic platform that helps you turn your content into a knowledge graph of your data.
3. Using a semantic platform, tag your queries against this knowledge graph and identify relevant content within your data, relating to the subject and context of the queries.
4. Create a prompt using our Augmented Prompt Generation approach, including a hybrid search against the semantic knowledge graph, increasing relevancy and accuracy for the user.
5. Pass this prompt to the generative AI and get an answer that is then re-validated against the knowledge graph and the reference documents to promote accuracy.
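
To make the flow concrete, here is a minimal, pseudocode-style sketch of steps 3 to 5. The names semantic_platform, marklogic_search and llm are hypothetical stand-ins for the semantic platform, the hybrid search and the generative AI model; the final re-validation against the knowledge graph is omitted for brevity.

```python
# Minimal sketch of the augmented-prompt flow (steps 3-5). All callables here
# are hypothetical placeholders, not a specific MarkLogic or Semaphore API.

def answer_question(question: str, llm, semantic_platform, marklogic_search) -> str:
    # Step 3: tag the user query against the knowledge graph
    concepts = semantic_platform.tag(question)

    # Step 3: identify relevant, security-filtered content for those concepts
    passages = marklogic_search(question, concepts)

    # Step 4: assemble the augmented prompt from the query, concepts and passages
    prompt = (
        "Answer the question using only the context below.\n\n"
        "Context:\n" + "\n---\n".join(passages) +
        "\n\nQuestion: " + question
    )

    # Step 5: pass the prompt to the generative AI and return its answer
    return llm(prompt)
```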

Business Benefits

  • Delivers accurate answers by providing the right enterprise content as context for the GenAI model
  • Reduces hallucinations, bias and reasoning errors with curated enterprise knowledge
  • Optimizes results by interchanging multiple AI models inexpensively across the same reusable enterprise content
  • Provides up-to-date responses by handling dynamic data sources in real time
  • Helps secure your content to reduce the risk of hacking and data loss through robust security controls

Technical Benefits

  • Holds real-time, semantically relevant enterprise data
  • Enables rapid integration of enterprise information
  • Has tunable, use-case dependent relevancy
  • Retrieves knowledge independent of generative AI models
  • Provides data-quality tools for curation, mastering and aggregation
  • Adheres to enterprise standards for security, governance and lineage

Webinar

The Next Frontier for the Enterprise: The Human-AI Collaboration

Foundational data quality is a prerequisite for the reliability, accuracy and relevancy of AI-driven solutions. From empowering workforces with new skills to harnessing AI for data-driven decision-making, hear about the human role in steering AI’s integration and evolution within businesses.

Watch webinar

Flexible, Secure and Connected Data Ecosystem for AI

The Progress MarkLogic platform combines multi-model data management with real-time, relevance-based search and semantic capabilities to provide an adaptable, secure foundation for your generative AI solutions.

Generative AI Model Independence

Easily switch generative AI models to adapt to new business requirements or take advantage of technology advancements. With the MarkLogic platform’s flexible data model, you can use the same memory and data against multiple generative AI models to enable a variety of use cases without incurring the extra costs of re-indexing.
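
As a rough illustration of what model independence looks like in code, here is a minimal sketch assuming LangChain's langchain-openai package, with credentials supplied through the standard environment variables; the deployment and model names are placeholders. The retrieval layer and prompts stay the same, and only the model object changes.

```python
# A minimal sketch of swapping generative AI models behind the same retrieval
# layer. Assumes the langchain-openai package and credentials in environment
# variables (AZURE_OPENAI_API_KEY / AZURE_OPENAI_ENDPOINT or OPENAI_API_KEY);
# deployment and model names are placeholders.
from langchain_openai import AzureChatOpenAI, ChatOpenAI

def build_llm(provider: str):
    if provider == "azure":
        return AzureChatOpenAI(azure_deployment="gpt-4o", api_version="2024-02-01")
    return ChatOpenAI(model="gpt-4o-mini")

# Same prompt and same retrieved context work against either model.
llm = build_llm("azure")
print(llm.invoke("Summarize our returns policy in one sentence.").content)
```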

Learn more
Smart Data Curation

Harness the wealth of your enterprise content and provide a rich information set to your generative AI models. The MarkLogic platform integrates diverse data sources and formats, including structured and unstructured data, to create a curated, high-quality and consistent data source for your AI-enhanced applications with easy model-driven mapping, entity modeling and smart mastering.

Learn more
Human-Readable Knowledge Repository

Improve the robustness of LLM responses with a semantic knowledge graph as your AI model’s external long-term memory. MarkLogic allows you to store, index and search RDF triples and enrich your data models with new semantic relationships and metadata to provide enhanced context for your AI systems.
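
For example, a minimal sketch of querying the knowledge graph through the MarkLogic REST API SPARQL endpoint (/v1/graphs/sparql), assuming a local server on port 8000 with digest authentication; the host, credentials and predicate IRI are placeholders.

```python
# Query RDF triples in MarkLogic via the REST API SPARQL endpoint.
# Host, credentials and the example predicate IRI are placeholders.
import requests
from requests.auth import HTTPDigestAuth

SPARQL = """
SELECT ?product ?supplier
WHERE { ?product <http://example.org/suppliedBy> ?supplier }
LIMIT 10
"""

response = requests.post(
    "http://localhost:8000/v1/graphs/sparql",
    data=SPARQL,
    headers={
        "Content-Type": "application/sparql-query",
        "Accept": "application/sparql-results+json",
    },
    auth=HTTPDigestAuth("my-user", "my-password"),
)
response.raise_for_status()
for binding in response.json()["results"]["bindings"]:
    print(binding["product"]["value"], "->", binding["supplier"]["value"])
```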

Learn more
Real-Time Hybrid Querying

Power natural language search and human-centric experiences with multi-model, real-time data querying and data service APIs. The MarkLogic native search engine identifies the most relevant information to answer a user question with comprehensive indexing, relevance ranking, co-occurrence and proximity boosting and returns high-confidence results.
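
As an illustration, here is a minimal sketch of a relevance-ranked text search against the MarkLogic REST search endpoint (/v1/search), assuming a local server on port 8000 with digest authentication; the host, credentials and query string are placeholders.

```python
# Relevance-ranked search through the MarkLogic REST API.
# Host, credentials and the query string are placeholders.
import requests
from requests.auth import HTTPDigestAuth

response = requests.get(
    "http://localhost:8000/v1/search",
    params={"q": "data lineage requirements", "format": "json", "pageLength": 5},
    auth=HTTPDigestAuth("my-user", "my-password"),
)
response.raise_for_status()
for result in response.json()["results"]:
    # Each result carries the document URI, a relevance score and match snippets.
    print(result["uri"], result["score"])
```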

Learn more
Native Vector Operations (Early Access)

Significantly improve document search relevance to maximize the retrieval effectiveness of your RAG systems. MarkLogic native vector operations allow you to store vector embeddings close to your data in JSON or XML format and perform large-scale similarity searches that refine the top search results for even greater accuracy, prioritizing the content that best matches user queries.
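
As a sketch of the "embeddings close to your data" idea, the snippet below writes a chunk and its vector into one JSON document through the MarkLogic REST API on a local server with digest authentication. The document shape, URI and vector values are illustrative; the Early Access server-side vector operations themselves are not shown here.

```python
# Store a vector embedding in the same JSON document as the text it describes.
# Host, credentials, URI and the document shape are illustrative only.
import json
import requests
from requests.auth import HTTPDigestAuth

chunk_doc = {
    "source": "/contracts/acme-2024.pdf",
    "text": "Either party may terminate with 30 days written notice.",
    "embedding": [0.0132, -0.0871, 0.0449],  # truncated example vector
}

response = requests.put(
    "http://localhost:8000/v1/documents",
    params={"uri": "/chunks/acme-2024-chunk-0001.json"},
    data=json.dumps(chunk_doc),
    headers={"Content-Type": "application/json"},
    auth=HTTPDigestAuth("my-user", "my-password"),
)
response.raise_for_status()
```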

Learn more
Enterprise Standards and Trust

Take AI projects from incubation to production with robust security and unmatched scalability. The MarkLogic advanced security controls tightly couple role-based and query-based access to the content used by GenAI to generate the answer users get, helping to elevate data privacy. Automated lineage and provenance explain how generative AI models reach conclusions and reference the sources generating the response to build trust.

Learn more

Human Expertise at a Machine Scale with Semantic Technologies

Progress MarkLogic and Progress® Semaphore® can enhance generative AI’s answers with enterprise data and SME knowledge, improving AI trustworthiness.

Explore Semaphore

Get Started with Our AI Examples Library

Accelerate your AI implementation with our RAG examples and sample code for the most common AI use cases.

Split Documents into Chunks

See examples of how to split text into smaller chunks that can be stored in the same document or as separate documents.
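
One approach, sketched below with LangChain's RecursiveCharacterTextSplitter, keeps all chunks in a single JSON document alongside the source URI. The chunk sizes, file name and document shape are illustrative, not the exact code from the examples library.

```python
# Split a long text into overlapping chunks and gather them into one document.
# File name, chunk sizes and the document shape are illustrative placeholders.
from langchain_text_splitters import RecursiveCharacterTextSplitter

long_document_text = open("policy-manual.txt").read()  # any long string works

splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_text(long_document_text)

chunked_doc = {
    "source-uri": "/docs/policy-manual.txt",
    "chunks": [{"text": chunk} for chunk in chunks],
}
```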

Integrate with LangChain and Azure OpenAI

See how to build a RAG retriever for your AI application using a text, semantic or vector query.
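
For a flavor of the retriever pattern, here is a minimal sketch of a custom LangChain retriever built on a text query against /v1/search; it is not the connector from the examples library. The host, credentials and the assumption that each hit is a JSON chunk document with a "text" field are placeholders, and a semantic or vector query would follow the same pattern with a different search payload. The returned documents can then be passed as context to an Azure OpenAI chat model, as in the earlier model-swap sketch.

```python
# A minimal custom LangChain retriever backed by the MarkLogic REST search API.
# Host, credentials and the chunk-document shape are placeholders.
from typing import Any, List

import requests
from requests.auth import HTTPDigestAuth
from langchain_core.callbacks import CallbackManagerForRetrieverRun
from langchain_core.documents import Document
from langchain_core.retrievers import BaseRetriever


class MarkLogicSearchRetriever(BaseRetriever):
    base_url: str = "http://localhost:8000"
    auth: Any = HTTPDigestAuth("my-user", "my-password")
    page_length: int = 5

    def _get_relevant_documents(
        self, query: str, *, run_manager: CallbackManagerForRetrieverRun
    ) -> List[Document]:
        # Text query against the REST search endpoint.
        search = requests.get(
            f"{self.base_url}/v1/search",
            params={"q": query, "format": "json", "pageLength": self.page_length},
            auth=self.auth,
        )
        search.raise_for_status()

        documents = []
        for result in search.json()["results"]:
            # Fetch each matching JSON chunk document and expose its text.
            content = requests.get(
                f"{self.base_url}/v1/documents",
                params={"uri": result["uri"]},
                auth=self.auth,
            ).json()
            documents.append(
                Document(
                    page_content=content.get("text", ""),
                    metadata={"uri": result["uri"], "score": result["score"]},
                )
            )
        return documents


retriever = MarkLogicSearchRetriever()
context_docs = retriever.invoke("termination notice period")
```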

Generate Vector Embeddings

Learn how to add vector embeddings to documents in MarkLogic Server with LangChain and the MarkLogic Data Movement SDK.
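
A minimal sketch with the Azure OpenAI Python SDK is shown below; the deployment name, API version and document shape are placeholders. The enriched document can then be written to MarkLogic with the REST call shown earlier, or in bulk via the (Java-based) Data Movement SDK.

```python
# Generate an embedding for a text chunk and attach it to the chunk document.
# Deployment name, API version and the document shape are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
)

chunk_doc = {"text": "Either party may terminate with 30 days written notice."}

embedding_response = client.embeddings.create(
    model="text-embedding-3-small",  # your Azure embedding deployment name
    input=chunk_doc["text"],
)
chunk_doc["embedding"] = embedding_response.data[0].embedding  # vector next to text
```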

RAG with Vector Reranking

Watch a demo of how to orchestrate a hybrid search in MarkLogic Server and use native vector operations to refine your results.
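
The demo relies on MarkLogic's server-side vector operations for the reranking step; the sketch below conveys the same idea client-side, reordering lexical search hits by cosine similarity. Here embed() and search() are hypothetical stand-ins for the embedding and search calls shown earlier, and each candidate is assumed to carry an "embedding" field.

```python
# Client-side illustration of vector reranking over hybrid search results.
# embed() and search() are hypothetical; candidates must carry "embedding".
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rerank(query: str, embed, search, top_k: int = 5) -> list[dict]:
    query_vector = embed(query)   # hypothetical embedding call
    candidates = search(query)    # hypothetical lexical/hybrid search
    scored = [
        (cosine_similarity(query_vector, doc["embedding"]), doc)
        for doc in candidates
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:top_k]]
```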

Related Resources

Frequently Asked Questions


Build an Agile RAG Architecture with MarkLogic

Develop contextualized and trustworthy generative AI-enhanced applications.