
Link https://abetlen.github.io/llama-cpp-python/whl/cu125 returns 404 Not Found. #2127

@wimtdw

Description

I've been following the instructions in the README on how to install a pre-built wheel with CUDA support. The links for versions 12.1-12.4 work fine, but not for 12.5.

Readme section:

"Pre-built Wheel (New)

It is also possible to install a pre-built wheel with CUDA support. As long as your system meets some requirements:

  • CUDA Version is 12.1, 12.2, 12.3, 12.4 or 12.5
  • Python Version is 3.10, 3.11 or 3.12

pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/<cuda-version>

Where <cuda-version> is one of the following:

  • cu121: CUDA 12.1
  • cu122: CUDA 12.2
  • cu123: CUDA 12.3
  • cu124: CUDA 12.4
  • cu125: CUDA 12.5

For example, to install the CUDA 12.1 wheel:

pip install llama-cpp-python \
  --extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cu121"
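
For anyone scripting this install, here is a minimal sketch of the mapping the README describes, from a CUDA release string to the wheel index URL. The helper name `cuda_wheel_tag` is hypothetical and not part of llama-cpp-python; only the URL pattern comes from the README, and as reported above the cu125 index currently returns 404:

```python
# Hypothetical helper (not part of llama-cpp-python): build the
# extra-index-url suffix ("cu121", "cu124", ...) from a CUDA version.
def cuda_wheel_tag(version: str) -> str:
    major, minor = version.split(".")[:2]
    return f"cu{major}{minor}"

# Versions listed in the README; per this issue, cu125 is not served yet.
DOCUMENTED_TAGS = {"cu121", "cu122", "cu123", "cu124", "cu125"}

tag = cuda_wheel_tag("12.4")
url = f"https://abetlen.github.io/llama-cpp-python/whl/{tag}"
print(tag in DOCUMENTED_TAGS, url)
```

A script using this could fall back to the nearest documented tag (or a source build) when the index for the detected CUDA version does not exist.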
