All Issues
24,993 verified issues
Minor version update from 3.42.2 -> 3.43.0 causes breaking changes on Node version 6.
Issue Summary

I believe the issue is due to a change where the "New-Style" URL object is used unconditionally, without checking which version of Node is running.

Steps to Reproduce

1. Use Node version 6.
2. Run the usual steps to create a room.
3. Get the following error.

Code Snippet

[code block]

Exception/Log

[code block]

Technical details:
- twilio-node version: 3.43.0
- node version: 6.11.3
AttributeError: module 'socket' has no attribute 'TCP_KEEPINTVL' on Windows and macOS
I'm trying to use the anthropic-sdk on both Windows and macOS, but I'm encountering the following error on both platforms: [code block] After investigating, I found that the socket.TCP_KEEPINTVL attribute isn't available on all platforms, particularly Windows and macOS. To resolve the issue locally, I modified _base_client.py as follows: [code block] With this change, the SDK works as expected on both platforms. Would this be a valid fix to submit as a PR? Happy to open one if this approach looks good.
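The proposed fix can be sketched as feature detection: only include the Linux-only TCP keepalive constants when the running platform actually defines them. This is a minimal illustrative sketch, not the SDK's actual `_base_client.py` code; the helper name and interval values are made up.

```python
import socket


def keepalive_socket_options():
    """Build a list of TCP keepalive socket options, including only the
    constants the current platform supports (hypothetical helper)."""
    opts = [(socket.SOL_SOCKET, socket.SO_KEEPALIVE, True)]
    # TCP_KEEPIDLE / TCP_KEEPINTVL / TCP_KEEPCNT are not defined on every
    # platform (notably Windows and macOS), so guard each with hasattr
    # instead of referencing them unconditionally.
    if hasattr(socket, "TCP_KEEPIDLE"):
        opts.append((socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60))
    if hasattr(socket, "TCP_KEEPINTVL"):
        opts.append((socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 60))
    if hasattr(socket, "TCP_KEEPCNT"):
        opts.append((socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 5))
    return opts
```

With this guard, importing and building the options never raises `AttributeError`, and Linux still gets the full set of keepalive tunables.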
Missing ForwardedFrom attribute
I set up a webhook on a French phone number (+33). I then made a call to another phone number that forwards to the webhook phone number, but the ForwardedFrom attribute is missing. Is this the expected behavior?
nest_asyncio hangs AsyncAnthropic
When running with `nest_asyncio.apply()`, the AsyncAnthropic call hangs and execution never leaves the thread: [code block]
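For context on why `nest_asyncio` exists at all: plain asyncio refuses to re-enter an already-running event loop, and `nest_asyncio.apply()` patches that check out, which is where interactions with async clients such as AsyncAnthropic can go wrong. A stdlib-only illustration of the underlying restriction (no Anthropic code involved):

```python
import asyncio


async def main() -> str:
    # Without nest_asyncio, re-entering the running loop is rejected outright.
    # nest_asyncio.apply() monkey-patches this check, which can interact badly
    # with libraries that manage their own async I/O.
    try:
        asyncio.run(asyncio.sleep(0))
    except RuntimeError as exc:
        return str(exc)
    return "no error"


message = asyncio.run(main())
print(message)
```

This only demonstrates the loop-nesting restriction; it does not reproduce the hang itself.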
Count Tokens for Bedrock
I am getting the following error when trying to count tokens in Bedrock: AttributeError: 'AnthropicBedrock' object has no attribute 'count_tokens'. It seems there is no implementation for it. Thanks.
Support Pydantic >= 2.0.0 ?
Hello, I noticed that the project is currently depending on pydantic version "^1.9.0". Given that pydantic is already at 2.1.1, I was wondering if there are any plans to upgrade your dependencies for this project. Thanks!
Source distribution tarballs on pypi do not contain README.md, causing hatch-fancy-pypi-readme plugin to fail
The source distributions published to pypi presently cannot be built; attempts fail with: [code block] Inspecting the tarball to list non-`.py` content shows only: [code block] Notably, there is indeed no README.md file present.
File descriptor leak
I'm guessing it's simply not recommended to initialize the Client on each API call like this, but if you do something like the code below, you get an ever-growing number of open files (as shown by `lsof`) until you hit an `OSError`. [code block] I can fix this by calling `client._client.close()`, but it would be nicer to expose a public `close()` method or to support using the client as a context manager. Suggestion here: https://github.com/anthropics/anthropic-sdk-python/pull/83 Or, if this is simply the wrong way to use the client, maybe that could be mentioned in the README. Thanks!
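The suggested fix can be sketched as a context-manager wrapper. The toy class below only mimics the proposed `close()`/`with` API shape; it is not the SDK's implementation, and `ToyClient` is a made-up stand-in.

```python
class ToyClient:
    """Stand-in for an SDK client that owns an open HTTP connection pool."""

    def __init__(self) -> None:
        self.closed = False  # pretend an underlying httpx client was opened here

    def close(self) -> None:
        # Release the underlying file descriptors exactly once.
        self.closed = True

    def __enter__(self) -> "ToyClient":
        return self

    def __exit__(self, *exc_info) -> None:
        self.close()


with ToyClient() as client:
    leaked_during_use = client.closed  # False: resources are held while in use
print(client.closed)  # resources released on exit, even if an exception occurs
```

With this shape, per-call client construction still releases its descriptors deterministically instead of waiting for garbage collection.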
Using the application inference profile in Bedrock results in failed model invocations.
Amazon Bedrock has added a new feature called "application inference profiles". Using an application inference profile is like adding an alias to a base model.

Creating an application inference profile: [code block]
> arn:aws:bedrock:us-west-2:637423213562:application-inference-profile/hq2of259skzs

For Bedrock's InvokeModel, you can specify the application inference profile as the modelId: [code block]

However, when using the Anthropic SDK, specifying the application inference profile as the model results in an error: [code block]
> Message(id=None, content=None, model=None, role=None, stop_reason=None, stop_sequence=None, type=None, usage=None, Output={'__type': 'com.amazon.coral.service#UnknownOperationException'}, Version='1.0')

This is likely because the model parameter does not expect an ARN. Please let me know if you have any further questions regarding this.
Wrong request options in retries
We are using the `AsyncAnthropicBedrock` class. We noticed that when a request is automatically retried, the request options do not have the correct values, resulting in a 400 Bad Request. Example logs (actual content redacted): [code block] This started happening with version 0.31.0; it doesn't happen with version 0.30.1. It may have something to do with https://github.com/anthropics/anthropic-sdk-python/pull/580 ?
Model Claude 3 Haiku is not available?
I tried to contact support, but I could not find a way to reach them. Could the developers tell me when the Claude 3 Haiku model will be available? I tried the console playground, but there is no Claude 3 Haiku model. Thanks.
ModuleNotFoundError: No module named 'tokenizers.tokenizers'
While using the latest version of anthropic, I'm seeing the above error. I used traceback and its root cause seems to be the tokenizers package:

    line 19, in <module>
      from anthropic import AuthenticationError, NotFoundError
    File "/opt/python/anthropic/__init__.py", line 6, in <module>
      from ._client import (
    File "/opt/python/anthropic/_client.py", line 10, in <module>
      from tokenizers import Tokenizer  # type: ignore[import]
    File "/opt/python/tokenizers/__init__.py", line 78, in <module>
      from .tokenizers import (
    ModuleNotFoundError: No module named 'tokenizers.tokenizers'

Any ideas on how to resolve this? I also opened an issue on the tokenizers library: https://github.com/huggingface/tokenizers/issues/1416 Installed: `anthropic==0.7.8`, `tokenizers==0.15.0`
Request without tools definition.
Error code: 400 - {'type': 'error', 'error': {'type': 'invalid_request_error', 'message': 'Requests which include tool_use or tool_result blocks must define tools.'}}

I believe this issue would be better addressed directly by the Anthropic server team, but I have not found a way to report it to them directly. When interacting with the Anthropic API, my initial request includes the necessary tool definitions, which are processed correctly. However, for certain reasons, I need to remove the tool definitions after responding appropriately to the assistant's tool_use request. Subsequent messages sent to the Anthropic API without tool definitions then trigger the error above.

Toggling between states where tools are or are not provided is a common scenario, as it is sometimes necessary to manually prevent unintended tool invocations; the simplest method is to omit the tool definitions when they are not needed. However, the current API design seems to require that once tools have been invoked, all subsequent requests must include tool definitions. This requirement seems unreasonable. We hope Anthropic will consider this use case and make the necessary adjustments.
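In the meantime, the workaround implied by the error message is to keep the `tools` list in every follow-up request whose `messages` still contain `tool_use`/`tool_result` blocks. A sketch of such a payload as plain dicts; the tool name, model string, and IDs are made up for illustration:

```python
# Hypothetical request payload: even though we no longer want the model to
# call tools, the API currently rejects the request unless `tools` is defined.
payload = {
    "model": "claude-3-5-sonnet-latest",  # placeholder model name
    "max_tokens": 1024,
    "tools": [  # must remain present once tool blocks appear in `messages`
        {
            "name": "get_weather",
            "description": "Example tool (illustrative only)",
            "input_schema": {"type": "object", "properties": {}},
        }
    ],
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"},
        {
            "role": "assistant",
            "content": [
                {"type": "tool_use", "id": "toolu_01", "name": "get_weather", "input": {}}
            ],
        },
        {
            "role": "user",
            "content": [
                {"type": "tool_result", "tool_use_id": "toolu_01", "content": "Sunny"}
            ],
        },
    ],
}
```

Dropping the `tools` key from this payload is exactly what triggers the 400 described above.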
pydantic 1.10.13 error on anthropic 0.26.1
Just a heads up: I am running __Anthropic 0.26.1__ with __pydantic version 1.10.13__ and it is throwing an error: _"ImportError: cannot import name 'version_short' from 'pydantic.version' (/opt/conda/lib/python3.10/site-packages/pydantic/version.cpython-310-x86_64-linux-gnu.so)"_. The pyproject.toml asks for "pydantic>=1.9.0, <3".
Output token usage being misreported as 1 when using streaming
In the final Message object that the Python SDK constructs when streaming completes (`stream.get_final_message()`), `usage.output_tokens` is always reported as 1, regardless of actual output length. I believe this object is supposed to be comparable to the Message object you would get in non-streaming mode, so I would expect it to report the total output tokens.
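In the raw streaming events, the `message_start` usage carries `input_tokens` plus only a placeholder output count, while the cumulative `output_tokens` arrives on the final `message_delta` event, so a fix would presumably merge the delta into the final message. A hypothetical helper sketching that merge, with event shapes simplified to plain dicts (not SDK code):

```python
def merge_streaming_usage(start_usage: dict, delta_usage: dict) -> dict:
    """Combine usage from message_start (input_tokens plus a placeholder
    output count) with the cumulative output_tokens reported on the final
    message_delta event. Hypothetical helper for illustration only."""
    merged = dict(start_usage)
    if "output_tokens" in delta_usage:
        # The delta value is cumulative, so it simply replaces the placeholder.
        merged["output_tokens"] = delta_usage["output_tokens"]
    return merged


usage = merge_streaming_usage(
    {"input_tokens": 25, "output_tokens": 1},  # from message_start
    {"output_tokens": 512},                    # from the final message_delta
)
print(usage)
```

After the merge, the final usage matches what a non-streaming Message would report for the same completion.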
Claude claude-v1-100k showing 10k token limit
I have API access and am testing `claude-v1-100k`. I call the API, passing `text` into the prompt, which is a large document. [code block] I see a max token count of ~10k: [code block] Is it possible that some users do not have access to `claude-v1-100k`? I also tried with a second API key.
Find API Key on anthropic.com
Where can I find my API key? Super interested in testing this out 🚀
Feature request: tool_choice='none' - for sending tool results, but ensuring another tool call does not occur
Currently, when sending tool results, the tool definitions must be included in the request. However, sometimes when sending tool results, a user may want to restrict the model from calling another tool. Since `tool_choice` cannot be set to "none" and the tool definitions are still present, there is no way to prevent Claude from calling any additional tools when sending tool results. [code block]
Skill use in v0.71.0 doesn't work
According to the API documentation, I should be able to use the following code example to run Agent Skills: [code block] But this code doesn't work in version 0.71.0, because I get a TypeError: "Messages.create() got an unexpected keyword argument 'container'". It looks like this version still doesn't fully support Agent Skills. Am I right?
messages.create() takes a JSON schema dict but messages.stream() takes a Pydantic model
Just ran into this slight inconsistency in the API relating to structured content support. Here's the `messages.create()` idea of `output_format=`: https://github.com/anthropics/anthropic-sdk-python/blob/d9aea38e754d55f8f0875fdf19ee44f78ca7b845/src/anthropic/resources/beta/messages/messages.py#L106 Which uses: https://github.com/anthropics/anthropic-sdk-python/blob/d9aea38e754d55f8f0875fdf19ee44f78ca7b845/src/anthropic/types/beta/beta_json_output_format_param.py#L11-L15 So it accepts a Python dictionary that represents a JSON schema. But the `messages.stream()` method: https://github.com/anthropics/anthropic-sdk-python/blob/d9aea38e754d55f8f0875fdf19ee44f78ca7b845/src/anthropic/resources/beta/messages/messages.py#L1351 Later does this: https://github.com/anthropics/anthropic-sdk-python/blob/d9aea38e754d55f8f0875fdf19ee44f78ca7b845/src/anthropic/resources/beta/messages/messages.py#L1385-L1386 `TypeAdapter` is a Pydantic concept, so this has the effect of only allowing Pydantic types to be passed to `messages.stream()`. It's confusing and inconsistent that `messages.create()` requires a JSON schema dictionary but `messages.stream()` instead requires a Pydantic model.