
Extension error when going beyond token limit (o1Preview) #132

Open
whackedMallard opened this issue Dec 13, 2024 · 2 comments
whackedMallard commented Dec 13, 2024

WRT Version 0.8 of the extension:

  • While using the extension, I was conversing with o1-preview.
  • I then received the 'output_error' (below) in the output panel.
  • I understand that I went beyond its context window, but this should not generate an error. The context window should simply continue rolling and truncate what was sent (see the sketch after the log below).

<output_error>
[2024-12-12T23:39:59.505Z] [INFO] Command registration.
Connected to agent:Inference.Service.Agent pipe after retries:1
Finished agent startup...
Agent unlocked
Finished agent startup...
Agent unlocked
Information: Microsoft.Neutron.Rpc.Service.JsonRpcService [2306] 2024-12-13T10:40:01.6479238+11:00 Accepting pipe incoming pipeName:ai.19cd8dc3ab87f0271171b75092552347 numOfSession:2
[2024-12-12T23:40:02.913Z] [INFO] telemetry event:activate_extension sent
[2024-12-12T23:44:07.787Z] [INFO] Loading View: catalogModels
[2024-12-12T23:44:22.259Z] [INFO] Loading View: modelPlayground
[2024-12-13T00:07:12.803Z] [ERROR] Failed to chatStream. provider = "GitHub", model = "o1-preview", errorMessage = "Error: Unable to inference the GitHub model o1-preview due to 413. Request body too large for o1-preview model. Max size: 4000 tokens.", errorType = "c", errorObject = {}
[2024-12-13T00:07:12.804Z] [ERROR] Unable to inference the GitHub model o1-preview due to 413. Request body too large for o1-preview model. Max size: 4000 tokens.
</output_error>
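
For illustration only, not the extension's actual implementation: a minimal TypeScript sketch of the rolling-window behavior requested above, assuming the request is a list of chat messages and the model's token budget is known (4000 tokens for o1-preview, per the error). The `estimateTokens` helper is a hypothetical placeholder; a real implementation would use the model's tokenizer.

```typescript
// Sketch: keep only the most recent messages that fit within a model's
// context budget before sending, instead of failing with a 413.
// Token counts here are rough estimates (~4 characters per token).

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: estimate tokens from character length.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Walk backwards from the newest message, keeping messages until the
// budget is spent; older messages are dropped from the front.
function truncateToBudget(messages: ChatMessage[], maxTokens: number): ChatMessage[] {
  const kept: ChatMessage[] = [];
  let used = 0;
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i].content);
    if (used + cost > maxTokens) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}
```

Dropping the oldest turns first keeps the latest exchange intact; whether a system prompt should always be preserved regardless of age is a separate design choice.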

Thank you for contacting us! Any issue or feedback from you is quite important to us. We will do our best to respond to your issue as soon as possible. Sometimes additional investigation is needed; we will usually get back to you within 2 days by adding comments to this issue. Please stay tuned.

microsoft-github-policy-service bot added the 'needs attention' label Dec 13, 2024
swatDong added the 'feature request' label Dec 17, 2024
swatDong self-assigned this Dec 17, 2024
swatDong (Contributor) commented:

Thanks for contacting us. We've created a backlog item for this feature request.

swatDong removed the 'needs attention' label Dec 24, 2024