DIN: Data Intelligence Network

Annotation Reply

Annotation Reply provides tailored, high-quality replies for a range of applications through manual annotation.

Key Uses:

  1. Specialized Replies for Specific Sectors: This is particularly valuable for customer service and knowledge bases in business, government, and similar settings. Annotating replies allows precise answers to specific questions, for example by setting a "standard annotation" for some questions or marking others as "unanswerable" (a possible annotation record is sketched after this list).

  2. Quick Adaptation for Prototypes: During rapid prototype development, annotated replies can significantly improve reply quality and, with it, customer satisfaction.
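For illustration only, an annotation record could be represented with a small schema like the one below. The `Annotation` class, its field names, and the `unanswerable` flag are assumptions made for this sketch and are not part of the DIN specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Annotation:
    """A manually curated reply for a known question (hypothetical schema)."""
    question: str               # the annotated question
    answer: Optional[str]       # the standard annotation, or None if unanswerable
    unanswerable: bool = False  # mark questions that should not be answered

# Example records: a "standard annotation" and an "unanswerable" question.
annotations = [
    Annotation(
        question="How do I reset my account password?",
        answer="Visit Settings > Security and choose 'Reset password'.",
    ),
    Annotation(
        question="Can you share another customer's billing details?",
        answer=None,
        unanswerable=True,
    ),
]
```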

How It Works:

Annotation Reply provides an alternative path that enhances retrieval by skipping the generation phase of Large Language Models (LLMs) and avoiding the overhead of Retrieval-Augmented Generation (RAG) whenever a matching annotation exists. A minimal code sketch of this flow follows the steps below.

  1. Once activated, the user can annotate LLM dialogue replies. An annotation can be a high-quality answer taken directly from the LLM or an edited version of it. Annotated content is saved for future use.

  2. When similar questions are asked again, the system identifies matching annotated questions.

  3. If a match is found, the annotated answer is returned directly, bypassing LLM or RAG processes.

  4. Without a match, the query follows the standard LLM or RAG process.

  5. When Annotation Reply is deactivated, the system stops matching queries against the saved annotations.
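The sketch below illustrates the flow described in these steps, reusing the hypothetical `Annotation` records from the earlier example. It assumes annotated questions are embedded and compared to the incoming query by cosine similarity against a configurable threshold; the `embed` and `run_llm_or_rag` callables, the threshold value, and the refusal message are illustrative placeholders rather than DIN APIs.

```python
import math

SIMILARITY_THRESHOLD = 0.85  # assumed cutoff for treating questions as "similar"

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def answer_query(query, annotations, embed, run_llm_or_rag, enabled=True):
    """Return an annotated reply when a similar question exists, else fall back.

    `embed` maps text to an embedding vector; `run_llm_or_rag` is the
    application's standard LLM / RAG pipeline.
    """
    if enabled and annotations:
        q_vec = embed(query)
        # Step 2: score the saved annotated questions against the new query.
        score, best = max(
            ((cosine(q_vec, embed(a.question)), a) for a in annotations),
            key=lambda pair: pair[0],
        )
        if score >= SIMILARITY_THRESHOLD:
            # Step 3: a match is found, so the annotated answer is returned
            # directly, bypassing LLM generation and RAG.
            if best.unanswerable:
                return "This question has been marked as unanswerable."
            return best.answer
    # Steps 4-5: no match, or Annotation Reply is deactivated; follow the
    # standard LLM or RAG process.
    return run_llm_or_rag(query)
```

In practice, the annotation embeddings would likely be precomputed and stored in a vector index rather than re-embedded on every query; the loop above is kept simple to mirror the steps in the list.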
