Technical Architecture


This section describes the internal technical architecture of the AI Chat feature, including its core components, communication flow, external service integrations, and security principles.

 

 

Server-Side Message Storage (Chat Storage Layer)

The AI Chat uses a centralized server-side storage model.

 

Key characteristics:

 

All chat sessions are stored in the backend database.

Persistent context handling across sessions.

Support for multiple parallel chat sessions per user.

Audit logging and traceability.

Token usage tracking.

No local browser storage dependency.

 

This ensures reliability, traceability, and full server control over chat data.
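The storage model can be illustrated with a simplified session record. The field and class names below are illustrative only, not the actual Raynet One schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChatMessage:
    role: str          # "user" or "assistant"
    content: str
    tokens_used: int   # tracked per message for usage reporting

@dataclass
class ChatSession:
    session_id: str
    user_id: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    messages: list[ChatMessage] = field(default_factory=list)

    def add_message(self, role: str, content: str, tokens_used: int = 0) -> None:
        # Every message is persisted server-side; nothing lives in the browser.
        self.messages.append(ChatMessage(role, content, tokens_used))

    @property
    def total_tokens(self) -> int:
        # Token usage is aggregated per session for tracking and auditing.
        return sum(m.tokens_used for m in self.messages)

# A user may hold multiple parallel sessions, each stored in the backend database.
session = ChatSession(session_id="s-1", user_id="u-42")
session.add_message("user", "Which licenses expire this month?", tokens_used=12)
session.add_message("assistant", "Three licenses expire next month.", tokens_used=20)
```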

 

 

Real-time Communication

The AI Chat uses WebSocket technology to provide real-time responses. When you send a message:

 

1.Your message is sent to the AI service.

2.The assistant processes your request and may query the database if needed.

3.You will see status updates showing what the assistant is working on.

4.The response streams back to you in real-time.
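The streaming behavior in steps 3 and 4 can be sketched as follows. The event shapes (`status` and `delta` messages) are assumptions for illustration, not the actual Raynet One wire format:

```python
import json

def assemble_stream(events: list[str]) -> tuple[list[str], str]:
    """Consume a stream of WebSocket events, separating status updates
    from response text chunks as they arrive."""
    statuses: list[str] = []
    chunks: list[str] = []
    for raw in events:
        event = json.loads(raw)
        if event["type"] == "status":
            statuses.append(event["text"])   # e.g. "Querying database..."
        elif event["type"] == "delta":
            chunks.append(event["text"])     # partial response text, shown live
    return statuses, "".join(chunks)

# Simulated sequence of events for a single request.
statuses, answer = assemble_stream([
    '{"type": "status", "text": "Querying database..."}',
    '{"type": "delta", "text": "You have "}',
    '{"type": "delta", "text": "3 expiring licenses."}',
])
```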

 

 

Secure Communication Flow (AI Communication Layer)

The frontend does not communicate directly with any AI provider.

Instead, the communication flow is:

 

Frontend → Backend → LiteLLM → AI Provider

 

Key principles:

 

The frontend never talks directly to an AI provider.

All AI requests are handled server-side by the backend.

API keys and credentials remain on the server.

Request validation and sanitization before each provider call.

Rate limiting and retry logic.

Centralized logging and cost tracking.

Abstraction layer (LiteLLM).

Full control over request orchestration.
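Two of these server-side safeguards, rate limiting and request sanitization, can be sketched as follows. The limits and cleaning rules shown are examples, not the actual Raynet One implementation:

```python
import time

class RateLimiter:
    """Fixed-window rate limiter, illustrating the gatekeeping applied
    before any request is forwarded to the AI provider."""
    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.window_start = time.monotonic()
        self.count = 0

    def allow(self) -> bool:
        now = time.monotonic()
        if now - self.window_start >= self.window_seconds:
            # New window: reset the counter.
            self.window_start, self.count = now, 0
        if self.count < self.max_requests:
            self.count += 1
            return True
        return False

def sanitize_prompt(prompt: str, max_length: int = 4000) -> str:
    # Strip control characters and enforce a length cap before the provider call.
    cleaned = "".join(ch for ch in prompt if ch.isprintable() or ch in "\n\t")
    return cleaned[:max_length]

limiter = RateLimiter(max_requests=2, window_seconds=60)
```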

 

This architecture ensures security, maintainability, and provider independence.

 

 

AI Provider Abstraction Layer (LiteLLM)

LiteLLM acts as an abstraction layer between Raynet One and external AI providers.

 

Responsibilities:

 

Provider abstraction (OpenAI, Azure, etc.)

AI model switching without application code changes

Centralized model configuration

Secure API key management

Usage & cost metrics collection

 

This design allows flexible provider switching and centralized AI governance.
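The routing idea behind the abstraction layer can be sketched as a central mapping from logical model names to concrete provider models. The route names and model identifiers below are illustrative, not the actual deployed configuration:

```python
# Centralized model configuration: application code refers only to logical
# names; the mapping to a concrete provider/model lives in one place.
MODEL_ROUTES = {
    "chat-default": "azure/gpt-4o",
    "chat-fast":    "openai/gpt-4o-mini",
}

def resolve_model(logical_name: str) -> str:
    """Switching providers means editing MODEL_ROUTES, not application code."""
    try:
        return MODEL_ROUTES[logical_name]
    except KeyError:
        raise ValueError(f"No route configured for model '{logical_name}'")
```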

 

 

Voice Architecture & Integration

Voice capabilities are integrated via external speech services.

 

Technical components:

 

Speech-to-Text (STT) via Azure Speech API

Text-to-Speech (TTS) via Azure Speech API

Server-side processing and orchestration

Browser-independent voice handling

Custom TTS provider instead of browser-native speechSynthesis

Voice command handling logic (stop, pause, resume)

 

The voice layer is fully integrated into the backend-controlled AI communication flow.
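The stop/pause/resume command handling can be sketched as a small state machine. The state names are illustrative of the handling logic, not the actual API:

```python
class VoicePlayback:
    """Minimal state machine for the stop, pause, and resume voice commands."""
    def __init__(self):
        self.state = "idle"

    def play(self):
        self.state = "playing"

    def handle_command(self, command: str) -> str:
        # Commands only take effect in states where they make sense.
        if command == "pause" and self.state == "playing":
            self.state = "paused"
        elif command == "resume" and self.state == "paused":
            self.state = "playing"
        elif command == "stop":
            self.state = "idle"
        return self.state

playback = VoicePlayback()
playback.play()
```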

 

Voice capabilities are optional and remain unavailable unless the Speech integration is configured. Configuration is performed via the Voice Control integration settings (see AI Chat Configuration & Deployment > Voice Mode Configuration). The voice layer follows the same backend-controlled communication flow, security principles, and permissions boundaries as text-based interaction.

 

 

External Service Dependencies (License Management Context)

The License Management workspace relies on multiple external services to provide AI-supported functionality and enriched data processing capabilities.

 

The services listed below are required for full functionality. Without them, the License Management workspace operates only with reduced capabilities.

 

 

SAMCloud Integration

 

SAMCloud provides AI-enhanced services specifically for License Management.

 

Responsibilities:

 

Document parsing (e.g., invoices, license contracts)

Extraction of structured core information from uploaded files

Access to the Encyclopedia service

Additional AI-driven analysis services

 

Configuration (Docker example - sanitized):

 

SAMCloud__Url: "<SAMCLOUD_URL>"

SAMCloud__ApiKey: "<SAMCLOUD_API_KEY>"

SAMCloud__MaxAttempts: 5

MaxPollingAttempts: 6

PollingDelaySeconds: 10
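The polling settings above suggest an asynchronous job model: the backend submits a document and then polls for the result. A minimal sketch of that loop, with `fetch_status` standing in for the actual SAMCloud status call:

```python
import time

def poll_for_result(fetch_status, max_attempts: int = 6, delay_seconds: float = 10.0):
    """Poll an asynchronous job until it completes, mirroring the
    MaxPollingAttempts / PollingDelaySeconds settings."""
    for attempt in range(max_attempts):
        result = fetch_status()
        if result is not None:
            return result
        if attempt < max_attempts - 1:
            time.sleep(delay_seconds)
    raise TimeoutError(f"Job did not complete within {max_attempts} attempts")

# Simulated job that completes on the third status check.
responses = iter([None, None, {"status": "done"}])
result = poll_for_result(lambda: next(responses), max_attempts=6, delay_seconds=0.0)
```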

 

 

Without SAMCloud, License Management remains operational but lacks automated document analysis and enriched AI capabilities.

 

 

Technology Catalog Integration

 

The Technology Catalog provides structured product and technology data used within License Management and AI-assisted workflows.

 

Responsibilities:

 

Enriched product metadata

Technology normalization

Structured software and vendor information

Data foundation for AI-supported analysis

 

Configuration (Docker example - sanitized):

 

Catalog__Url: "<CATALOG_SERVICE_URL>"

Catalog__ApiKey: "<CATALOG_API_KEY>"

Catalog__MaxAttempts: 3

 

The Catalog service enhances the quality and consistency of license and product-related insights.

 


 

 

AI Chat / LLM Integration

 

AI Chat enables natural language interaction and AI-supported analysis within the system.

 

Responsibilities:

 

Natural language query handling

Context-aware responses

AI-assisted evaluation of license and asset data

 

The AI Chat backend communicates with the configured AI provider through the LiteLLM abstraction layer.

 

Configuration (Docker example - sanitized):

 

AiConfig__Url: "<AI_SERVICE_URL>"

AiConfig__ApiKey: "<AI_API_KEY>"

AiConfig__Model: "<MODEL_NAME>"
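The backend reads these variables at startup and fails fast if any are missing. A sketch of that check (the double-underscore key format matches the sanitized example above; treating it as the common nested-settings convention is an assumption):

```python
def load_ai_config(env: dict[str, str]) -> dict[str, str]:
    """Validate and read the AiConfig__* variables as the backend
    container would see them. Key names follow the sanitized example."""
    required = ["AiConfig__Url", "AiConfig__ApiKey", "AiConfig__Model"]
    missing = [key for key in required if not env.get(key)]
    if missing:
        # Failing fast surfaces configuration errors at deployment time.
        raise RuntimeError(f"Missing AI configuration: {', '.join(missing)}")
    return {key.split("__", 1)[1].lower(): env[key] for key in required}

# Placeholder values only; real values come from the deployment environment.
config = load_ai_config({
    "AiConfig__Url": "https://ai.example.internal",
    "AiConfig__ApiKey": "sk-placeholder",
    "AiConfig__Model": "gpt-4o",
})
```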

 

 

Architectural Summary

For full License Management capabilities:

 

SAMCloud enables document intelligence and AI-enhanced parsing

The Technology Catalog provides structured product intelligence

AI Chat enables conversational interaction and AI-assisted analysis

 

Together, these services form the AI-supported License Management ecosystem.