Comments (6)
Hi @ekdnam, thank you for creating this issue. Could you please double-check that you have the two required models deployed to your AzureOpenAI instance:
- gpt-35-turbo
- text-embedding-ada-002
When setting things up by running the install script, it's important to use the Deployment names of the models if they differ from the model names:

IMPORTANT: For AzureOpenAI, if you deployed the models gpt-35-turbo and text-embedding-ada-002 with custom names (instead of each one's given name), also pass the parameters:

```powershell
-CompletionModel {DEPLOYMENT_NAME} -EmbeddingModel {DEPLOYMENT_NAME} -PlannerModel {DEPLOYMENT_NAME}
```
from chat-copilot.
@alliscode I had forgotten to deploy embeddings to AzureOpenAI. I was able to solve that error. Thanks!
However, I got a new error; please find the trace below:
```
fail: Microsoft.SemanticKernel.IKernel[0]
Something went wrong while executing the native function. Function: <GetDelegateInfo>b__0. Error: Service error: The service failed to process the request, HTTP status:500
Microsoft.SemanticKernel.AI.AIException: Service error: The service failed to process the request, HTTP status:500
---> Azure.RequestFailedException: Service request failed.
Status: 500 (Internal Server Error)
Content:
{ "statusCode": 500, "message": "Internal server error", "activityId": "04ac5880-7bfb-4987-96e8-4b18d67a8f36" }
Headers:
x-ms-client-request-id: 22cdc27b-42dc-434d-b261-d4b990a1c56e
apim-request-id: REDACTED
Strict-Transport-Security: REDACTED
X-Content-Type-Options: REDACTED
x-ms-region: REDACTED
Date: Tue, 08 Aug 2023 05:27:42 GMT
Content-Length: 111
Content-Type: application/json
at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
at Azure.AI.OpenAI.OpenAIClient.GetEmbeddingsAsync(String deploymentOrModelName, EmbeddingsOptions embeddingsOptions, CancellationToken cancellationToken)
at Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.RunRequestAsync[T](Func`1 request)
--- End of inner exception stack trace ---
at Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.RunRequestAsync[T](Func`1 request)
at Microsoft.SemanticKernel.Connectors.AI.OpenAI.AzureSdk.ClientBase.InternalGetEmbeddingsAsync(IList`1 data, CancellationToken cancellationToken)
at Microsoft.SemanticKernel.AI.Embeddings.EmbeddingGenerationExtensions.GenerateEmbeddingAsync[TValue,TEmbedding](IEmbeddingGeneration`2 generator, TValue value, CancellationToken cancellationToken)
at Microsoft.SemanticKernel.Memory.SemanticTextMemory.SearchAsync(String collection, String query, Int32 limit, Double minRelevanceScore, Boolean withEmbeddings, CancellationToken cancellationToken)+MoveNext()
at Microsoft.SemanticKernel.Memory.SemanticTextMemory.SearchAsync(String collection, String query, Int32 limit, Double minRelevanceScore, Boolean withEmbeddings, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
at SemanticKernel.Service.CopilotChat.Skills.ChatSkills.SemanticChatMemorySkill.QueryMemoriesAsync(String query, String chatId, Int32 tokenLimit, ISemanticTextMemory textMemory) in /Users/ekdnam/Projects/copilot-chat-app/webapi/CopilotChat/Skills/ChatSkills/SemanticChatMemorySkill.cs:line 57
at SemanticKernel.Service.CopilotChat.Skills.ChatSkills.SemanticChatMemorySkill.QueryMemoriesAsync(String query, String chatId, Int32 tokenLimit, ISemanticTextMemory textMemory) in /Users/ekdnam/Projects/copilot-chat-app/webapi/CopilotChat/Skills/ChatSkills/SemanticChatMemorySkill.cs:line 57
at SemanticKernel.Service.CopilotChat.Skills.ChatSkills.ChatSkill.GetChatResponseAsync(String chatId, SKContext chatContext) in /Users/ekdnam/Projects/copilot-chat-app/webapi/CopilotChat/Skills/ChatSkills/ChatSkill.cs:line 341
at SemanticKernel.Service.CopilotChat.Skills.ChatSkills.ChatSkill.ChatAsync(String message, String userId, String userName, String chatId, String messageType, String planJson, String messageId, SKContext context) in /Users/ekdnam/Projects/copilot-chat-app/webapi/CopilotChat/Skills/ChatSkills/ChatSkill.cs:line 267
at Microsoft.SemanticKernel.SkillDefinition.SKFunction.InvokeAsync(SKContext context, CompleteRequestSettings settings)
fail: Microsoft.SemanticKernel.IKernel[0]
Function call fail during pipeline step 0: ChatSkill.Chat. Error: Service error: The service failed to process the request, HTTP status:500
```
Hello, world!
Does this mean I have to use a query vector store? Nothing in-memory?
Or is it something different altogether?
@ekdnam is this error persistent? Generally, 500-level errors indicate a transient failure within the service itself and are considered retriable. If the issue persists, you might want to configure logging in your Azure OpenAI instance to get more information about what is going wrong.
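As a general pattern (not chat-copilot's actual retry logic; the Azure SDK applies its own retry policy internally), transient 500s are typically handled by retrying with exponential backoff. A minimal Python sketch, where `flaky_request` is a stand-in for any service call:

```python
import time

def retry_with_backoff(fn, max_attempts=4, base_delay=0.01):
    """Call fn, retrying on exception with exponentially growing delays."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Simulate a service that returns HTTP 500 twice, then succeeds.
calls = {"count": 0}

def flaky_request():
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("HTTP 500: Internal Server Error")
    return "ok"

print(retry_with_backoff(flaky_request))  # prints "ok" after two retries
```

If a request still fails after several backed-off attempts, the problem is likely persistent rather than transient, which is when service-side logging becomes useful.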
As for the vector store, the default setup is to use volatile (in-memory) storage, but this can be configured in the MemoryStore section of the appsettings.json file within the webapi.
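For reference, that section looks roughly like this (a sketch only; key names and supported store types vary across chat-copilot versions, so check the appsettings.json in your own checkout):

```json
{
  "MemoryStore": {
    "Type": "volatile",
    "Qdrant": {
      "Host": "http://localhost",
      "Port": "6333",
      "VectorSize": 1536
    },
    "AzureCognitiveSearch": {
      "Endpoint": ""
    }
  }
}
```

Switching "Type" from "volatile" to one of the other configured stores persists memories across restarts of the webapi.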
@ekdnam We've verified that Azure OpenAI experienced an intermittent outage yesterday. Can you confirm whether you are still experiencing this 500 error?
@alliscode thanks for letting me know.
The app is working now.
Thanks for your help! I am closing the issue now.