
Conversation

@mudler (Owner) commented Jul 22, 2025

Description

CI currently fails to build flash-attn for CUDA 11, which breaks the vllm backend. This PR drops vllm for CUDA 11; we can try to re-add it later if there is a way to make it work again.
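A minimal sketch of the kind of gating this change implies: skip the vllm backend when the build targets CUDA 11. The `CUDA_MAJOR_VERSION` variable and `should_build_vllm` helper are hypothetical illustrations, not LocalAI's actual CI logic.

```shell
# Hypothetical build-script guard; LocalAI's real CI configuration may differ.
CUDA_MAJOR_VERSION="${CUDA_MAJOR_VERSION:-11}"

should_build_vllm() {
  # Skip vllm on CUDA 11, where flash-attn currently fails to compile.
  [ "$CUDA_MAJOR_VERSION" -ge 12 ]
}

if should_build_vllm; then
  echo "building vllm backend"
else
  echo "skipping vllm backend for CUDA ${CUDA_MAJOR_VERSION}"
fi
```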

Notes for Reviewers

Signed commits

  • Yes, I signed my commits.

Signed-off-by: Ettore Di Giacinto <[email protected]>
@mudler mudler merged commit 9b80625 into master Jul 22, 2025
18 of 19 checks passed
@mudler mudler deleted the fix/vllm-cuda-11 branch July 22, 2025 16:47
netlify bot commented Jul 22, 2025

Deploy Preview for localai ready!

| Name | Link |
|---|---|
| 🔨 Latest commit | 59ef92a |
| 🔍 Latest deploy log | https://app.netlify.com/projects/localai/deploys/687fc09b1b86550008c5525c |
| 😎 Deploy Preview | https://deploy-preview-5881--localai.netlify.app |

