AssistantService.Chat Method

Handles a chat request by streaming responses from the backend.

Definition

Namespace: SuperBuilderWinService.Services
Assembly: IntelAiaService (in IntelAiaService.exe) Version: 1.0.0+2093811f3bac5da092b5ce9c8172233582eee4d5
C#
public override Task Chat(
	ChatRequest request,
	IServerStreamWriter<ChatResponse> responseStream,
	ServerCallContext context
)
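
The override below is a minimal sketch of how a handler with this shape might relay backend output to the caller. The AssistantService.AssistantServiceBase base class and the ChatResponse.Message property are assumed from the generated gRPC types, and IChatBackend / StreamChatAsync is a hypothetical backend abstraction used only for illustration, not the service's actual API.

C#
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Grpc.Core;

// Hypothetical backend abstraction that yields response chunks as they are produced.
public interface IChatBackend
{
    IAsyncEnumerable<string> StreamChatAsync(ChatRequest request, CancellationToken cancellationToken);
}

public class AssistantServiceSketch : AssistantService.AssistantServiceBase
{
    private readonly IChatBackend _backend;

    public AssistantServiceSketch(IChatBackend backend) => _backend = backend;

    public override async Task Chat(
        ChatRequest request,
        IServerStreamWriter<ChatResponse> responseStream,
        ServerCallContext context)
    {
        // Relay each chunk from the backend to the client as soon as it arrives,
        // honoring cancellation of the gRPC call.
        await foreach (var chunk in _backend.StreamChatAsync(request, context.CancellationToken))
        {
            await responseStream.WriteAsync(new ChatResponse { Message = chunk });
        }
    }
}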

Parameters

request  ChatRequest
The chat request to handle. The examples below show the three supported request shapes; a C# sketch that builds the Special Workflows request appears after this parameter list.

LLM-Only Example:
{
  "name": "UI",
  "prompt": "What is the weather today?",
  "history": [
    { "role": "user", "content": "Hello" },
    { "role": "assistant", "content": "Hi, how can I help you?" }
  ],
  "sessionId": 123,     // optional, but required to save the message and response to session history
  "attachedFiles": "[]" // an empty list means no files are used, i.e. LLM-only chat
}

RAG + LLM Example:
{
  "name": "UI",
  "prompt": "Who is Mercy Baker?",
  "attachedFiles": "[\"/path/to/file1.txt\", \"/path/to/file2.txt\"]"
  // Note: omit attachedFiles to use ALL files in the vector DB
}

Special Workflows Example:
{
  "name": "UI",
  "prompt": "Lead Cloud Architect at Intel",
  "history": [
    { "role": "user", "content": "Hello" },
    { "role": "assistant", "content": "Hi, how can I help you?" }
  ],
  "sessionId": 123,
  "promptOptions": {
    // Set the oneof field to select the workflow type and fill out any prompt parameters it requires.
    // Here the ScoreResumes workflow is selected; when promptOptions is unset, the request defaults to generic chat.
    "scoreResumesPrompt": { "isScoringCriteria": true }
  },
  "attachedFiles": "[\"/path/to/resume1.pdf\", \"/path/to/resume2.pdf\"]"
}
responseStream  IServerStreamWriter<ChatResponse>
The stream to send chat responses back to the client.
context  ServerCallContext
The gRPC call context.
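
The snippet below sketches how the Special Workflows request shown above might be built from C#. The generated protobuf property names are assumed to follow the JSON field names (PascalCase properties such as PromptOptions and ScoreResumesPrompt), and the ChatMessage type used for history entries is likewise an assumption.

C#
// Build a ChatRequest for the ScoreResumes workflow, mirroring the Special Workflows example.
// Type and property names here are assumed from the generated protobuf classes.
var request = new ChatRequest
{
    Name = "UI",
    Prompt = "Lead Cloud Architect at Intel",
    SessionId = 123,
    PromptOptions = new PromptOptions
    {
        // Setting this oneof member selects the ScoreResumes workflow.
        ScoreResumesPrompt = new ScoreResumesPrompt { IsScoringCriteria = true }
    },
    AttachedFiles = "[\"/path/to/resume1.pdf\", \"/path/to/resume2.pdf\"]"
};

// history is a repeated field in the request; entries are appended via Add.
request.History.Add(new ChatMessage { Role = "user", Content = "Hello" });
request.History.Add(new ChatMessage { Role = "assistant", Content = "Hi, how can I help you?" });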

Return Value

Task
A task that completes when the server has finished streaming responses. Each ChatResponse carries a JSON-encoded chunk of the reply in its message field. Example:
{ "message": "{ \"message\": \"weather \" }" }
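
A client-side sketch of consuming the stream, assuming a generated AssistantService.AssistantServiceClient and the message field shown in the example above; the server address is a placeholder.

C#
using System;
using System.Text.Json;
using Grpc.Core;
using Grpc.Net.Client;

// Placeholder address; the real endpoint depends on how the service is hosted.
using var channel = GrpcChannel.ForAddress("http://localhost:5000");
var client = new AssistantService.AssistantServiceClient(channel);

using var call = client.Chat(new ChatRequest
{
    Name = "UI",
    Prompt = "What is the weather today?",
    AttachedFiles = "[]" // empty list: LLM-only chat
});

// Each ChatResponse.Message is itself a JSON string such as { "message": "weather " };
// unwrap it and print the streamed text as it arrives.
await foreach (var response in call.ResponseStream.ReadAllAsync())
{
    using var doc = JsonDocument.Parse(response.Message);
    Console.Write(doc.RootElement.GetProperty("message").GetString());
}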
