
chore(deps): bump the gha group across 1 directory with 15 updates#3758

Open
dependabot[bot] wants to merge 1 commit into main from
dependabot/uv/packages/opentelemetry-instrumentation-llamaindex/gha-b7dcf0291a

Conversation


@dependabot dependabot bot commented on behalf of github Mar 4, 2026

Bumps the gha group with 15 updates in the /packages/opentelemetry-instrumentation-llamaindex directory:

| Package | From | To |
| --- | --- | --- |
| opentelemetry-semantic-conventions-ai | 0.4.13 | 0.4.15 |
| llama-index | 0.14.12 | 0.14.15 |
| ruff | 0.14.11 | 0.15.4 |
| chromadb | 0.5.23 | 1.5.2 |
| llama-index-embeddings-openai | 0.5.1 | 0.5.2 |
| llama-index-llms-cohere | 0.6.1 | 0.7.1 |
| llama-index-llms-openai | 0.6.13 | 0.6.25 |
| llama-index-postprocessor-cohere-rerank | 0.5.1 | 0.7.0 |
| onnxruntime | 1.19.2 | 1.24.2 |
| openai | 1.109.1 | 2.24.0 |
| opentelemetry-instrumentation-chromadb | 0.50.1 | 0.52.6 |
| opentelemetry-instrumentation-cohere | 0.50.1 | 0.52.6 |
| opentelemetry-instrumentation-openai | 0.50.1 | 0.52.6 |
| pytest-asyncio | 0.23.8 | 1.3.0 |
| sqlalchemy | 2.0.45 | 2.0.48 |
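Of the fifteen bumps above, three cross a major version boundary (chromadb, openai, pytest-asyncio) and warrant the closest review. A minimal sketch of how a reviewer might flag those automatically from the pinned version strings — the `BUMPS` mapping below is transcribed from the table, not read from any manifest, and pre-release tags are ignored:

```python
# Sketch: classify each bump by semver impact using only the version strings
# from the table. For real manifests, packaging.version.Version would be the
# robust choice; this assumes plain X.Y.Z strings.
BUMPS = {
    "opentelemetry-semantic-conventions-ai": ("0.4.13", "0.4.15"),
    "llama-index": ("0.14.12", "0.14.15"),
    "ruff": ("0.14.11", "0.15.4"),
    "chromadb": ("0.5.23", "1.5.2"),
    "onnxruntime": ("1.19.2", "1.24.2"),
    "openai": ("1.109.1", "2.24.0"),
    "pytest-asyncio": ("0.23.8", "1.3.0"),
    "sqlalchemy": ("2.0.45", "2.0.48"),
}

def parse(v: str) -> tuple[int, ...]:
    return tuple(int(part) for part in v.split("."))

def impact(old: str, new: str) -> str:
    o, n = parse(old), parse(new)
    if n[0] != o[0]:
        return "major"
    if n[1] != o[1]:
        return "minor"
    return "patch"

majors = sorted(name for name, (o, n) in BUMPS.items() if impact(o, n) == "major")
print(majors)  # → ['chromadb', 'openai', 'pytest-asyncio']
```

These three are the same entries the commit metadata at the bottom of this PR marks as `version-update:semver-major`.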

Updates opentelemetry-semantic-conventions-ai from 0.4.13 to 0.4.15

Updates llama-index from 0.14.12 to 0.14.15

Release notes

Sourced from llama-index's releases.

v0.14.15

Release Notes

[2026-02-18]

llama-index-agent-agentmesh [0.1.0]

  • [Integration] AgentMesh: Trust Layer for LlamaIndex Agents (#20644)

llama-index-core [0.14.15]

  • Support basic operations for multimodal types (#20640)
  • Feat recursive llm type support (#20642)
  • fix: remove redundant metadata_seperator field from TextNode (#20649)
  • fix(tests): update mock prompt type in mock_prompts.py (#20661)
  • Feat multimodal template var formatting (#20682)
  • Feat multimodal prompt templates (#20683)
  • Feat multimodal chat prompt helper (#20684)
  • Add retry and error handling to BaseExtractor (#20693)
  • ensure at least one message/content block is returned by the old memory (#20729)

llama-index-embeddings-ibm [0.6.0.post1]

  • chore: Remove persistent_connection parameter support, update (#20714)
  • docs: Update IBM docs (#20718)

llama-index-llms-anthropic [0.10.9]

  • Sonnet 4-6 addition (#20723)

llama-index-llms-bedrock-converse [0.12.10]

  • fix(bedrock-converse): ensure thinking_delta is populated in all chat modes (#20664)
  • feat(bedrock-converse): Add support for Claude Sonnet 4.6 (#20726)

llama-index-llms-ibm [0.7.0.post1]

  • chore: Remove persistent_connection parameter support, update (#20714)
  • docs: Update IBM docs (#20718)

llama-index-llms-mistralai [0.10.0]

  • Rrubini/mistral azure sdk (#20668)

llama-index-llms-oci-data-science [1.0.0]

  • Add support for new OCI DataScience endpoint /predictWithStream for streaming use case (#20545)

llama-index-observability-otel [0.3.0]

... (truncated)

Changelog

Sourced from llama-index's changelog.

llama-index-core [0.14.15]

  • Support basic operations for multimodal types (#20640)
  • Feat recursive llm type support (#20642)
  • fix: remove redundant metadata_seperator field from TextNode (#20649)
  • fix(tests): update mock prompt type in mock_prompts.py (#20661)
  • Feat multimodal template var formatting (#20682)
  • Feat multimodal prompt templates (#20683)
  • Feat multimodal chat prompt helper (#20684)
  • Add retry and error handling to BaseExtractor (#20693)
  • ensure at least one message/content block is returned by the old memory (#20729)

llama-index-embeddings-ibm [0.6.0.post1]

  • chore: Remove persistent_connection parameter support, update (#20714)
  • docs: Update IBM docs (#20718)

llama-index-llms-anthropic [0.10.9]

  • Sonnet 4-6 addition (#20723)

llama-index-llms-bedrock-converse [0.12.10]

  • fix(bedrock-converse): ensure thinking_delta is populated in all chat modes (#20664)
  • feat(bedrock-converse): Add support for Claude Sonnet 4.6 (#20726)

llama-index-llms-ibm [0.7.0.post1]

  • chore: Remove persistent_connection parameter support, update (#20714)
  • docs: Update IBM docs (#20718)

llama-index-llms-mistralai [0.10.0]

  • Rrubini/mistral azure sdk (#20668)

llama-index-llms-oci-data-science [1.0.0]

  • Add support for new OCI DataScience endpoint /predictWithStream for streaming use case (#20545)

llama-index-observability-otel [0.3.0]

  • improve otel data serialization by flattening dicts (#20719)
  • feat: support custom span processor; refactor: use llama-index-instrumentation instead of llama-index-core (#20732)

llama-index-program-evaporate [0.5.2]

  • Sandbox LLM-generated code execution in EvaporateExtractor (#20676)

llama-index-readers-bitbucket [0.4.2]

... (truncated)

Commits
  • 4937fc0 Release 0.14.15 (#20735)
  • 9869893 feat(bedrock-converse): Add support for Nova 2 (#20736)
  • 40da244 fix(layoutir): restrict requires-python to >=3.12 to match layoutir dependenc...
  • 6504188 feat: support custom span processor; refactor: use llama-index-instrumentatio...
  • dc716d1 chore: update issue classifier action to v0.2.0 (#20734)
  • 6d0aff4 ensure at least one message/conent block is returned by the old memory (#20729)
  • fdcc72c feat: add issue classifier gh action (#20720)
  • 171ae83 fix: Update WhatsAppChatLoader to retrieve DataFrame in pandas format (#20722)
  • 68c760a fix(layoutir): hotfix for output_dir crash and Block extraction (#20708 follo...
  • 83f45ce Add retry and error handling to BaseExtractor (#20693)
  • Additional commits viewable in compare view

Updates ruff from 0.14.11 to 0.15.4

Release notes

Sourced from ruff's releases.

0.15.4

Release Notes

Released on 2026-02-26.

This is a follow-up release to 0.15.3 that resolves a panic when the new rule PLR1712 was enabled with any rule that analyzes definitions, such as many of the ANN or D rules.

Bug fixes

  • Fix panic on access to definitions after analyzing definitions (#23588)
  • [pyflakes] Suppress false positive in F821 for names used before del in stub files (#23550)

Documentation

  • Clarify first-party import detection in Ruff (#23591)
  • Fix incorrect import-heading example (#23568)

Contributors

Install ruff 0.15.4

Install prebuilt binaries via shell script

curl --proto '=https' --tlsv1.2 -LsSf https://github.com/astral-sh/ruff/releases/download/0.15.4/ruff-installer.sh | sh

Install prebuilt binaries via powershell script

powershell -ExecutionPolicy Bypass -c "irm https://github.com/astral-sh/ruff/releases/download/0.15.4/ruff-installer.ps1 | iex"

Download ruff 0.15.4

File Platform Checksum
ruff-aarch64-apple-darwin.tar.gz Apple Silicon macOS checksum
ruff-x86_64-apple-darwin.tar.gz Intel macOS checksum
ruff-aarch64-pc-windows-msvc.zip ARM64 Windows checksum
ruff-i686-pc-windows-msvc.zip x86 Windows checksum
ruff-x86_64-pc-windows-msvc.zip x64 Windows checksum
ruff-aarch64-unknown-linux-gnu.tar.gz ARM64 Linux checksum
ruff-i686-unknown-linux-gnu.tar.gz x86 Linux checksum
ruff-powerpc64-unknown-linux-gnu.tar.gz PPC64 Linux checksum

... (truncated)

Changelog

Sourced from ruff's changelog.

0.15.4

Released on 2026-02-26.

This is a follow-up release to 0.15.3 that resolves a panic when the new rule PLR1712 was enabled with any rule that analyzes definitions, such as many of the ANN or D rules.

Bug fixes

  • Fix panic on access to definitions after analyzing definitions (#23588)
  • [pyflakes] Suppress false positive in F821 for names used before del in stub files (#23550)

Documentation

  • Clarify first-party import detection in Ruff (#23591)
  • Fix incorrect import-heading example (#23568)

Contributors

0.15.3

Released on 2026-02-26.

Preview features

  • Drop explicit support for .qmd file extension (#23572)

    This can now be enabled instead by setting the extension option:

    # ruff.toml
    extension = { qmd = "markdown" }

    # pyproject.toml
    [tool.ruff]
    extension = { qmd = "markdown" }

  • Include configured extensions in file discovery (#23400)

  • [flake8-bandit] Allow suspicious imports in TYPE_CHECKING blocks (S401-S415) (#23441)

  • [flake8-bugbear] Allow B901 in pytest hook wrappers (#21931)

  • [flake8-import-conventions] Add missing conventions from upstream (ICN001, ICN002) (#21373)

... (truncated)

Commits
  • f14edd8 Bump 0.15.4 (#23595)
  • fd09d37 Fix panic on access to definitions after analyzing definitions (#23588)
  • 81d655f [pyflakes] suppress false positive in F821 for names used before del in...
  • 625b4f5 [ruff] docs: Clarify first-party import detection in Ruff (#23591)
  • 60facfa one word typo fix in a while_loop.md test case (#23589)
  • fbb9fa7 docs: fix incorrect import-heading example (#23568)
  • 5bc49a9 Increase the ruleset size to 16 bits (#23586)
  • a62ba8c [ty] Fix overloaded callable assignability for unary Callable targets (#23277)
  • e5f2f36 Bump 0.15.3 (#23585)
  • 0e19fc9 [ty] defer calculating conjunctions in narrowing constraints (#23552)
  • Additional commits viewable in compare view

Updates chromadb from 0.5.23 to 1.5.2

Release notes

Sourced from chromadb's releases.

1.5.2

Version: 1.5.2
Git ref: refs/tags/1.5.2
Build Date: 2026-02-27T19:50
PIP Package: chroma-1.5.2.tar.gz
GitHub Container Registry Image: :1.5.2
DockerHub Image: :1.5.2

What's Changed

... (truncated)

Commits
  • 201b5b5 [RELEASE] Python 1.5.2 (#6513)
  • 2510bd4 ENH: batch sysdb queries in collection enrichment (#6496)
  • 517c6a8 [ENH] Pplx EF (#6511)
  • 721eb55 [CHORE] Set the tilt-up timeout for tilt ci to 10min. (#6508)
  • cab38cd Revert "[ENH] add tracing instrumentation to pull_logs (#6376)" (#6505)
  • 160f216 [CHORE] Add debug logging so we can see what the RLS sees when it throws bac...
  • 2ca9ef3 [TST] Don't run tests on markdown change (#6488)
  • fe2e807 [ENH] Dim some colors in dark mode (#6498)
  • 841ee34 [ENH] Move the filter by number of compactors to pre-enrichment. (#6487)
  • 1c1c7fc [CHORE][wal3] cut some recently added traces that exceed volume. (#6492)
  • Additional commits viewable in compare view

Updates llama-index-embeddings-openai from 0.5.1 to 0.5.2

Updates llama-index-llms-cohere from 0.6.1 to 0.7.1

Updates llama-index-llms-openai from 0.6.13 to 0.6.25

Updates llama-index-postprocessor-cohere-rerank from 0.5.1 to 0.7.0

Updates onnxruntime from 1.19.2 to 1.24.2

Release notes

Sourced from onnxruntime's releases.

ONNX Runtime v1.24.2

This is a patch release for ONNX Runtime 1.24, containing several bug fixes, security improvements, and execution provider updates.

Bug Fixes

  • NuGet: Fixed native library loading issues in the ONNX Runtime NuGet package on Linux and macOS. (#27266)
  • macOS: Fixed Java support and Jar testing on macOS ARM64. (#27271)
  • Core: Enable Robust Symlink Support for External Data for Huggingface Hub Cache. (#27374)
  • Core: Added boundary checks for SparseTensorProtoToDenseTensorProto to improve robustness. (#27323)
  • Security: Fixed an out-of-bounds read vulnerability in ArrayFeatureExtractor. (#27275)

Execution Provider Updates

  • MLAS: Fixed flakiness and accuracy issues in Lut GEMM (MatMulNBitsLutGemm). (#27216)
  • QNN: Enabled 64-bit UDMA mode for HTP target v81 or above. (#26677)
  • WebGPU:
    • Used LazyRelease for prepack allocator. (#27077)
    • Fixed ConvTranspose bias validation in both TypeScript and C++ implementations. (#27213)
  • OpenVINO (OVEP): Patch to reduce resident memory by reusing weight files across shared contexts. (#27238)
  • DNNL: Fixed DNNL build error by including missing files. (#27334)

Build and Infrastructure

  • CUDA:
    • Added support for CUDA architecture family codes (suffix 'f') introduced in CUDA 12.9. (#27278)
    • Fixed build errors and warnings for various CUDA versions (12.8, 13.0, 13.1.1). (#27276)
    • Applied patches for Abseil CUDA warnings. (#27096, #27126)
  • Pipelines:
    • Fixed Python packaging pipeline for Windows ARM64 and release. (#27339, #27350, #27299)
    • Fixed DirectML NuGet pipeline to correctly bundle x64 and ARM64 binaries for release. (#27349)
    • Updated Microsoft.ML.OnnxRuntime.Foundry package for Windows ARM64 support and NuGet signing. (#27294)
  • Testing: Updated BaseTester to support plugin EPs with both compiled nodes and registered kernels. (#27176)
  • Telemetry: Added service name and framework name to telemetry events for better usage understanding on Windows. (#27252, #27256)

Full Changelog: v1.24.1...v1.24.2

Contributors

@tianleiwu, @hariharans29, @edgchen1, @xiaofeihan1, @adrianlizarraga, @angelser, @angelserMS, @ankitm3k, @baijumeswani, @bmehta001, @ericcraw, @eserscor, @fs-eire, @guschmue, @mc-nv, @qjia7, @qti-monumeen, @titaiwangms, @yuslepukhin

ONNX Runtime v1.24.1

📢 Announcements & Breaking Changes

Platform Support Changes

  • Python 3.10 wheels are no longer published — Please upgrade to Python 3.11+
  • Python 3.14 support added
  • Free-threaded Python (PEP 703) — Added support for Python 3.13t and 3.14t in Linux (#26786)
  • x86_64 binaries for macOS/iOS are no longer provided and minimum macOS is raised to 14.0

API Version

  • ORT_API_VERSION updated to 24 (#26418)

... (truncated)

Commits

Updates openai from 1.109.1 to 2.24.0

Release notes

Sourced from openai's releases.

v2.24.0

2.24.0 (2026-02-24)

Full Changelog: v2.23.0...v2.24.0

Features

Bug Fixes

Chores

  • internal: make test_proxy_environment_variables more resilient to env (65af8fd)
  • internal: refactor sse event parsing (2344600)

v2.23.0

2.23.0 (2026-02-24)

Full Changelog: v2.22.0...v2.23.0

Features

  • api: add gpt-realtime-1.5 and gpt-audio-1.5 model options to realtime calls (3300b61)

Chores

  • internal: make test_proxy_environment_variables more resilient (6b441e2)

v2.22.0

2.22.0 (2026-02-23)

Full Changelog: v2.21.0...v2.22.0

Features

  • api: websockets for responses api (c01f6fb)

Chores

  • internal: add request options to SSE classes (cdb4315)
  • update mock server docs (91f4da8)

... (truncated)

Changelog

Sourced from openai's changelog.

2.24.0 (2026-02-24)

Full Changelog: v2.23.0...v2.24.0

Features

Bug Fixes

Chores

  • internal: make test_proxy_environment_variables more resilient to env (65af8fd)
  • internal: refactor sse event parsing (2344600)

2.23.0 (2026-02-24)

Full Changelog: v2.22.0...v2.23.0

Features

  • api: add gpt-realtime-1.5 and gpt-audio-1.5 model options to realtime calls (3300b61)

Chores

  • internal: make test_proxy_environment_variables more resilient (6b441e2)

2.22.0 (2026-02-23)

Full Changelog: v2.21.0...v2.22.0

Features

  • api: websockets for responses api (c01f6fb)

Chores

  • internal: add request options to SSE classes (cdb4315)
  • update mock server docs (91f4da8)

Documentation

... (truncated)

Commits
  • 656e3ca release: 2.24.0 (#2890)
  • 921c330 release: 2.23.0
  • 650ccd9 codegen metadata
  • 9e9a4f1 feat(api): add gpt-realtime-1.5 and gpt-audio-1.5 model options to realtime c...
  • 588d239 chore(internal): make test_proxy_environment_variables more resilient
  • 481ff6e release: 2.22.0
  • e273d62 feat(api): websockets for responses api
  • c612cfb chore(internal): add request options to SSE classes
  • 849c8df docs(api): add batch size limit to file_batches parameter descriptions
  • 5e5bc78 docs(api): update safety_identifier documentation in chat completions and res...
  • Additional commits viewable in compare view

Updates opentelemetry-instrumentation-chromadb from 0.50.1 to 0.52.6

Release notes

Sourced from opentelemetry-instrumentation-chromadb's releases.

0.52.6

v0.52.6 (2026-02-26)

Fix

  • dataset: Add versions to dataset metadata (#3732)
  • qdrant: support all versions of qdrant package (#3500)

[main a78de64] bump: version 0.52.5 → 0.52.6 64 files changed, 70 insertions(+), 63 deletions(-)

0.52.5

v0.52.5 (2026-02-23)

Fix

  • traceloop-sdk: Add evaluator config to the evaluator validator (#3706)
  • anthropic: restore accidentally lost cache tokens attributes (#3648)

[main c2974c9] bump: version 0.52.4 → 0.52.5 64 files changed, 70 insertions(+), 63 deletions(-)

0.52.4

v0.52.4 (2026-02-19)

Fix

  • openai-agents: fix realtime session event handling for prompts, completions, and usage (#3688)
  • preserve return values for RealtimeSession context manager methods (#3681)
  • openai-agents: add functools.wraps to dont_throw decorator (#3687)

[main ae9c348] bump: version 0.52.3 → 0.52.4 64 files changed, 71 insertions(+), 63 deletions(-)

0.52.3

v0.52.3 (2026-02-10)

Fix

  • openai-agents: add clear flag to support two instrumentation modes (#3489)

[main d4d4269] bump: version 0.52.2 → 0.52.3 64 files changed, 69 insertions(+), 63 deletions(-)

0.52.2

v0.52.2 (2026-02-08)

Fix

  • traceloop-sdk: Add conversation decorator (#3659)

... (truncated)

Changelog

Sourced from opentelemetry-instrumentation-chromadb's changelog.

v0.52.6 (2026-02-26)

Fix

  • dataset: Add versions to dataset metadata (#3732)
  • qdrant: support all versions of qdrant package (#3500)

v0.52.5 (2026-02-23)

Fix

  • traceloop-sdk: Add evaluator config to the evaluator validator (#3706)
  • anthropic: restore accidentally lost cache tokens attributes (#3648)

v0.52.4 (2026-02-19)

Fix

  • openai-agents: fix realtime session event handling for prompts, completions, and usage (#3688)
  • preserve return values for RealtimeSession context manager methods (#3681)
  • openai-agents: add functools.wraps to dont_throw decorator (#3687)

v0.52.3 (2026-02-10)

Fix

  • openai-agents: add clear flag to support two instrumentation modes (#3489)

v0.52.2 (2026-02-08)

Fix

  • traceloop-sdk: Add conversation decorator (#3659)
  • traceloop-sdk: Add endpoint_is_traceloop attribute (#3650)

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)

... (truncated)

Commits
  • a78de64 bump: version 0.52.5 → 0.52.6
  • ab40640 fix(dataset): Add versions to dataset metadata (#3732)
  • 0353605 fix(qdrant): support all versions of qdrant package (#3500)
  • 65de782 fix(Watsonx):Correct the package Name to enable Traces (#3693)
  • c2974c9 bump: version 0.52.4 → 0.52.5
  • 4cd6a97 fix(traceloop-sdk): Add evaluator config to the evaluator validator (#3706)
  • f35f06b chore(deps): bump the gha group across 1 directory with 41 updates (#3670)
  • b202e35 chore(deps-dev): bump the gha group across 1 directory with 4 updates (#3690)
  • 35f9f2b chore(deps): bump pypdf from 6.6.2 to 6.7.1 in /packages/opentelemetry-instru...
  • 742aa61 chore(deps): bump the uv group across 2 directories with 5 updates (#3699)
  • Additional commits viewable in compare view

Updates opentelemetry-instrumentation-cohere from 0.50.1 to 0.52.6

Release notes

Sourced from opentelemetry-instrumentation-cohere's releases.

0.52.6

v0.52.6 (2026-02-26)

Fix

  • dataset: Add versions to dataset metadata (#3732)
  • qdrant: support all versions of qdrant package (#3500)

[main a78de64] bump: version 0.52.5 → 0.52.6 64 files changed, 70 insertions(+), 63 deletions(-)

0.52.5

v0.52.5 (2026-02-23)

Fix

  • traceloop-sdk: Add evaluator config to the evaluator validator (#3706)
  • anthropic: restore accidentally lost cache tokens attributes (#3648)

[main c2974c9] bump: version 0.52.4 → 0.52.5 64 files changed, 70 insertions(+), 63 deletions(-)

0.52.4

v0.52.4 (2026-02-19)

Fix

  • openai-agents: fix realtime session event handling for prompts, completions, and usage (#3688)
  • preserve return values for RealtimeSession context manager methods (#3681)
  • openai-agents: add functools.wraps to dont_throw decorator (#3687)

[main ae9c348] bump: version 0.52.3 → 0.52.4 64 files changed, 71 insertions(+), 63 deletions(-)

0.52.3

v0.52.3 (2026-02-10)

Fix

  • openai-agents: add clear flag to support two instrumentation modes (#3489)

[main d4d4269] bump: version 0.52.2 → 0.52.3 64 files changed, 69 insertions(+), 63 deletions(-)

0.52.2

v0.52.2 (2026-02-08)

Fix

  • traceloop-sdk: Add conversation decorator (#3659)

... (truncated)

Changelog

Sourced from opentelemetry-instrumentation-cohere's changelog.

v0.52.6 (2026-02-26)

Fix

  • dataset: Add versions to dataset metadata (#3732)
  • qdrant: support all versions of qdrant package (#3500)

v0.52.5 (2026-02-23)

Fix

  • traceloop-sdk: Add evaluator config to the evaluator validator (#3706)
  • anthropic: restore accidentally lost cache tokens attributes (#3648)

v0.52.4 (2026-02-19)

Fix

  • openai-agents: fix realtime session event handling for prompts, completions, and usage (#3688)
  • preserve return values for RealtimeSession context manager methods (#3681)
  • openai-agents: add functools.wraps to dont_throw decorator (#3687)

v0.52.3 (2026-02-10)

Fix

  • openai-agents: add clear flag to support two instrumentation modes (#3489)

v0.52.2 (2026-02-08)

Fix

  • traceloop-sdk: Add conversation decorator (#3659)
  • traceloop-sdk: Add endpoint_is_traceloop attribute (#3650)

v0.52.1 (2026-02-02)

Fix

  • voyageai: add to commitizen to bump on release (#3660)

v0.52.0 (2026-02-02)

Feat

  • voyage-ai: add voyage-ai instrumentation (#3653)

Fix

  • openai-agents: apply content tracing flag to content (#3487)

... (truncated)

Commits

@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Mar 4, 2026
Updates `opentelemetry-semantic-conventions-ai` from 0.4.13 to 0.4.15

Updates `llama-index` from 0.14.12 to 0.14.15
- [Release notes](https://github.com/run-llama/llama_index/releases)
- [Changelog](https://github.com/run-llama/llama_index/blob/main/CHANGELOG.md)
- [Commits](run-llama/llama_index@v0.14.12...v0.14.15)

Updates `ruff` from 0.14.11 to 0.15.4
- [Release notes](https://github.com/astral-sh/ruff/releases)
- [Changelog](https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md)
- [Commits](astral-sh/ruff@0.14.11...0.15.4)

Updates `chromadb` from 0.5.23 to 1.5.2
- [Release notes](https://github.com/chroma-core/chroma/releases)
- [Changelog](https://github.com/chroma-core/chroma/blob/main/RELEASE_PROCESS.md)
- [Commits](chroma-core/chroma@0.5.23...1.5.2)

Updates `llama-index-embeddings-openai` from 0.5.1 to 0.5.2

Updates `llama-index-llms-cohere` from 0.6.1 to 0.7.1

Updates `llama-index-llms-openai` from 0.6.13 to 0.6.25

Updates `llama-index-postprocessor-cohere-rerank` from 0.5.1 to 0.7.0

Updates `onnxruntime` from 1.19.2 to 1.24.2
- [Release notes](https://github.com/microsoft/onnxruntime/releases)
- [Changelog](https://github.com/microsoft/onnxruntime/blob/main/docs/ReleaseManagement.md)
- [Commits](microsoft/onnxruntime@v1.19.2...v1.24.2)

Updates `openai` from 1.109.1 to 2.24.0
- [Release notes](https://github.com/openai/openai-python/releases)
- [Changelog](https://github.com/openai/openai-python/blob/main/CHANGELOG.md)
- [Commits](openai/openai-python@v1.109.1...v2.24.0)

Updates `opentelemetry-instrumentation-chromadb` from 0.50.1 to 0.52.6
- [Release notes](https://github.com/traceloop/openllmetry/releases)
- [Changelog](https://github.com/traceloop/openllmetry/blob/main/CHANGELOG.md)
- [Commits](0.50.1...0.52.6)

Updates `opentelemetry-instrumentation-cohere` from 0.50.1 to 0.52.6
- [Release notes](https://github.com/traceloop/openllmetry/releases)
- [Changelog](https://github.com/traceloop/openllmetry/blob/main/CHANGELOG.md)
- [Commits](0.50.1...0.52.6)

Updates `opentelemetry-instrumentation-openai` from 0.50.1 to 0.52.6
- [Release notes](https://github.com/traceloop/openllmetry/releases)
- [Changelog](https://github.com/traceloop/openllmetry/blob/main/CHANGELOG.md)
- [Commits](0.50.1...0.52.6)

Updates `pytest-asyncio` from 0.23.8 to 1.3.0
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](pytest-dev/pytest-asyncio@v0.23.8...v1.3.0)
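The pytest-asyncio jump from 0.23.8 to 1.3.0 crosses the 1.0 boundary, where the long-deprecated `event_loop` fixture override was removed. A minimal sketch of an async test in the 1.x style — assuming the suite runs in strict mode, so async tests carry an explicit marker; loop scoping is expressed via the `loop_scope` marker argument rather than a custom `event_loop` fixture:

```python
# Sketch of a pytest-asyncio 1.x-style async test (assumption: strict mode).
# Replaces the 0.23-era pattern of overriding the event_loop fixture.
import asyncio

import pytest


@pytest.mark.asyncio(loop_scope="function")
async def test_sleep_yields_control():
    # asyncio.sleep(0) yields control to the loop once and returns None.
    assert await asyncio.sleep(0) is None
```

Suites that still define an `event_loop` fixture will need migrating before this bump lands.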

Updates `sqlalchemy` from 2.0.45 to 2.0.48
- [Release notes](https://github.com/sqlalchemy/sqlalchemy/releases)
- [Changelog](https://github.com/sqlalchemy/sqlalchemy/blob/main/CHANGES.rst)
- [Commits](https://github.com/sqlalchemy/sqlalchemy/commits)

---
updated-dependencies:
- dependency-name: opentelemetry-semantic-conventions-ai
  dependency-version: 0.4.15
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: llama-index
  dependency-version: 0.14.15
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: ruff
  dependency-version: 0.15.4
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: chromadb
  dependency-version: 1.5.2
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: llama-index-embeddings-openai
  dependency-version: 0.5.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: llama-index-llms-cohere
  dependency-version: 0.7.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: llama-index-llms-openai
  dependency-version: 0.6.25
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: llama-index-postprocessor-cohere-rerank
  dependency-version: 0.7.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: onnxruntime
  dependency-version: 1.24.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: openai
  dependency-version: 2.24.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: opentelemetry-instrumentation-chromadb
  dependency-version: 0.52.6
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: opentelemetry-instrumentation-cohere
  dependency-version: 0.52.6
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: opentelemetry-instrumentation-openai
  dependency-version: 0.52.6
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: pytest-asyncio
  dependency-version: 1.3.0
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: sqlalchemy
  dependency-version: 2.0.48
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot force-pushed the dependabot/uv/packages/opentelemetry-instrumentation-llamaindex/gha-b7dcf0291a branch from 6e132fe to 82f4b39 on March 4, 2026 at 07:52
