ddtrace

Latest version: v2.17.3

2.14.3

Bug Fixes

- Code Security (IAST)
- Ensures that only the IAST propagation context is cleared, instead of all contexts, which could otherwise cause propagation loss in multithreaded applications. Additionally, improves validations in both the Processor and the Vulnerability Reporter, depending on whether IAST is active.
- Profiling
- Fixes endpoint profiling for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is set.
- Tracing
- Ensures the `DD_TRACE_RATE_LIMIT` environment variable is only applied to spans for which tracer sampling is configured. For spans not matching sampling rules, default rate limits should be applied by the Datadog Agent.
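As a sketch, the rate limit then only takes effect alongside a user-defined sampling rule; a minimal configuration might look like the following (the service name is hypothetical):

```shell
# Rate limiting applies only to spans matched by tracer sampling rules.
# "my-service" is a hypothetical service name.
export DD_TRACE_SAMPLING_RULES='[{"service": "my-service", "sample_rate": 0.5}]'
export DD_TRACE_RATE_LIMIT=100   # max rule-sampled spans per second
```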

---

2.14.2

Bug Fixes

- Tracing
- celery: Fixes an issue where `celery.apply` spans didn't close if the `after_task_publish` or `task_postrun` signals didn't get sent when using `apply_async`, which can happen if there is an internal exception during the handling of the task. This update also marks the span as an error if an exception occurs.
- celery: Fixes an issue where `celery.apply` spans using task_protocol 1 didn't close by improving the check for the task id in the body.

- Profiling
- All files with platform-dependent code have had their filenames updated to reflect the platform they are for. This fixes issues where the wrong file would be used on a given platform.
- Enables code provenance when using the libdatadog exporter, i.e. when any of `DD_PROFILING_EXPORT_LIBDD_ENABLED`, `DD_PROFILING_STACK_V2_ENABLED`, or `DD_PROFILING_TIMELINE_ENABLED` is set.
- Fixes an issue where the flame graph was upside down for stack v2 (`DD_PROFILING_STACK_V2_ENABLED`).

2.14.1

New Features

- Code Security (IAST): Always report a telemetry log error when an IAST propagation error is raised, regardless of whether the `_DD_IAST_DEBUG` environment variable is enabled.

Bug Fixes

- tracing: Removes a reference cycle that caused unnecessary garbage collection for top-level spans.
- Code Security: Fixes a potential memory leak in IAST exception handling.
- profiling: Fixes endpoint profiling when using libdatadog exporter, either with `DD_PROFILING_EXPORT_LIBDD_ENABLED` or `DD_PROFILING_TIMELINE_ENABLED`.
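The kind of reference cycle the tracing fix removes can be illustrated with a stand-alone sketch (the `Node` classes are hypothetical, not ddtrace's span implementation): objects in a parent/child cycle can only be reclaimed by the cyclic garbage collector, whereas breaking the cycle with a weak reference lets plain reference counting free them immediately.

```python
import gc
import weakref

class Node:
    """Hypothetical span-like object whose parent link creates a cycle."""
    def __init__(self, parent=None):
        self.parent = parent                  # strong reference upward
        self.children = []
        if parent is not None:
            parent.children.append(self)      # strong reference downward -> cycle

gc.collect()                                  # start from a clean slate
a = Node()
b = Node(parent=a)
del a, b
cyclic = gc.collect()                         # cycle: only the GC can reclaim these

class WeakNode:
    """Same shape, but the parent link is weak, so there is no cycle."""
    def __init__(self, parent=None):
        self.parent = weakref.ref(parent) if parent is not None else None
        self.children = []
        if parent is not None:
            parent.children.append(self)

a = WeakNode()
b = WeakNode(parent=a)
del a, b
acyclic = gc.collect()                        # nothing left: refcounting freed them
```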

2.14.0

Deprecation Notes
- Tracing
- Deprecates the `DD_TRACE_SPAN_AGGREGATOR_RLOCK` environment variable. It will be removed in v3.0.0.
- Deprecates support for [APM Legacy App Analytics](https://docs.datadoghq.com/tracing/legacy_app_analytics/). This feature and its associated configuration options are deprecated and will be removed in v3.0.0.
- `DD_HTTP_CLIENT_TAG_QUERY_STRING` configuration is deprecated and will be removed in v3.0.0. Use `DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING` instead.

New Features
- DSM
- Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
- Introduces new tracing and datastreams monitoring functionality for Google Protobuf.

- LLM Observability
- Adds support for automatically submitting Gemini Python SDK calls to LLM Observability.
- The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
- The LangChain integration now submits tool spans to LLM Observability.
- LLM Observability spans generated by the OpenAI integration now have updated span name and `model_provider` values. Span names are now prefixed with the OpenAI client name (possible values: `OpenAI/AzureOpenAI`) instead of the default `openai` prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The `model_provider` field also now corresponds to `openai` or `azure_openai` based on the OpenAI client.
- The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the `stream_options={"include_usage": True}` option is set on the completion or chat completion call.
- Introduces the `LLMObs.annotation_context()` context manager method, which allows modifying the tags of integration generated LLM Observability spans created while the context manager is active.
- Introduces prompt template annotation, which can be passed as an argument to `LLMObs.annotate(prompt={...})` for LLM span kinds. For more information on prompt annotations, see [the docs](https://docs.datadoghq.com/llm_observability/setup/sdk/#annotating-a-span).
- google_generativeai: Introduces tracing support for Google Gemini API `generate_content` calls.
See [the docs](https://ddtrace.readthedocs.io/en/stable/integrations.html#google_generativeai) for more information.
- openai: The OpenAI integration now includes a new `openai.request.client` tag with the possible values `OpenAI/AzureOpenAI` to help differentiate whether the request was made to Azure OpenAI or OpenAI.
- openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the `stream_options={"include_usage": True}` option is set on the completion or chat completion call.
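As a sketch, the request options that make streamed usage data available look like the following (the model name and message are hypothetical; a real call would pass these through the OpenAI client's `chat.completions.create`):

```python
# Hypothetical request arguments for a streamed chat completion.
# With include_usage set, the final streamed chunk carries token counts,
# which the integration can then record on the traced operation.
request_kwargs = {
    "model": "gpt-4o-mini",                      # hypothetical model name
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},   # request a final usage chunk
}
# with a real client: client.chat.completions.create(**request_kwargs)
```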

- Profiling
- Captures `asyncio.Lock` usages with `with` context managers.

- Other
- botocore: Adds span pointers to some successful AWS botocore spans. Currently, only S3 `PutObject` is supported.
- pymongo: Adds support for `pymongo>=4.9.0`.


Bug Fixes
- Code Security (ASM)
- Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
- Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.

- LLM Observability
- Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled if using agentless mode via `LLMObs.enable(agentless_enabled=True)` or setting `DD_LLMOBS_AGENTLESS_ENABLED=1`.
- Resolves an issue in the `LLMObs.annotate()` method where non-JSON serializable arguments were discarded entirely. Now, the `LLMObs.annotate()` method safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
- Resolves an issue where attempting to tag non-JSON serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
- anthropic: Resolves an issue where attempting to tag non-JSON serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags non-JSON serializable arguments with a default placeholder text.
- langchain: Resolves an issue where attempting to tag non-JSON serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags non-JSON serializable arguments with a default placeholder text.
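A sketch of the agentless setup the fix above refers to, configured via the environment:

```shell
# Run LLM Observability in agentless mode; integration metrics
# are then disabled automatically.
export DD_LLMOBS_AGENTLESS_ENABLED=1
# or, programmatically: LLMObs.enable(agentless_enabled=True)
```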

- Other
- SSI: Ensures the injection denylist is included in the published OCI package.
- postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
- pymongo: Ensures instances of `pymongo.MongoClient` can be patched after pymongo is imported.

2.13.3

Bug Fixes

- CI Visibility
- Fixes a bug where `CODEOWNERS` parsing would incorrectly fail to discard line-level trailing comments (e.g. `code/owner my comment` would result in the codeowners being parsed as `code/owner`, ``, `my`, and `comment`)
- Fixes unnecessary logging of an exception that appeared when trying to upload git metadata in an environment without functioning git (e.g. a missing `git` binary or `.git` directory)
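The corrected comment handling can be sketched with a hypothetical parser (not ddtrace's actual implementation): anything after a `#` is a comment and must be dropped before the line is split into a pattern and its owners.

```python
def parse_codeowners_line(line: str):
    """Hypothetical sketch: strip trailing comments before splitting."""
    line = line.split("#", 1)[0].strip()  # drop the '#' comment, if any
    if not line:
        return None                       # blank or comment-only line
    pattern, *owners = line.split()
    return pattern, owners
```

For example, `parse_codeowners_line("src/* @team-a  # my comment")` yields `("src/*", ["@team-a"])` rather than treating the comment words as owners.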

- Code Security
- Resolves an issue where functions whose names only partially matched the intended patch targets were being patched, instead of only exact matches.
- Resolves an issue where importing the `google.cloud.storage.batch` module would fail with an `ImportError`.

- LLM Observability
- Resolves an issue where LLM Observability evaluation metrics were not being submitted in forked processes. The evaluation metric writer thread now automatically restarts when a forked process is detected.
- Resolves an issue where input and output values equal to zero were not being annotated on workflow, task, agent and tool spans when using `LLMObs.annotate`.

- Profiling
- Improves the error message shown when the native exporter fails to load, and stops profiling from starting if ddtrace is also being injected.
- Fixes a data race where span information associated with a thread was read and updated concurrently, leading to segfaults.
- Resolves an issue where endpoint profiling for stack v2 threw a `TypeError` when given a `Span` with a `None` span_type.

- Tracing
- `elasticsearch`: Resolves an issue where span tags, including `resource_name`, were not fully populated on "sampled" spans, causing metric dimensions to be incorrect when spans were prematurely marked as sampled.


---

2.13.2

Bug Fixes
- Code Security
- Ensures IAST propagation does not cause side effects related to `re.finditer`.
- LLM Observability
- botocore: Fixes bedrock model and model provider interpretation from `modelId` when using cross-region inference.
- Profiling
- Fixes an issue where stack v2 could not be enabled because pthread was not properly linked on some Debian-based images for the aarch64 architecture.
- Tracing
- Resolves an issue where tracer flares would not be generated if unexpected types were received in the `AGENT_CONFIG` remote configuration product.

---
