FlashInfer Python Wheels for CUDA 12.4 + torch 2.5

flashinfer_python-0.2.0.post2+cu124torch2.5-cp310-cp310-linux_x86_64.whl
flashinfer_python-0.2.0.post2+cu124torch2.5-cp311-cp311-linux_x86_64.whl
flashinfer_python-0.2.0.post2+cu124torch2.5-cp312-cp312-linux_x86_64.whl
flashinfer_python-0.2.0.post2+cu124torch2.5-cp39-cp39-linux_x86_64.whl
flashinfer_python-0.2.1.post1+cu124torch2.5-cp38-abi3-linux_x86_64.whl
flashinfer_python-0.2.1.post2+cu124torch2.5-cp38-abi3-linux_x86_64.whl
flashinfer_python-0.2.2+cu124torch2.5-cp38-abi3-linux_x86_64.whl
flashinfer_python-0.2.2.post1+cu124torch2.5-cp38-abi3-linux_x86_64.whl
flashinfer_python-0.2.3+cu124torch2.5-cp38-abi3-linux_x86_64.whl
flashinfer_python-0.2.4+cu124torch2.5-cp38-abi3-linux_x86_64.whl
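
Note on tags: the 0.2.0.post2 release ships one wheel per CPython version (cp39 through cp312), whereas 0.2.1.post1 and later are built against the CPython stable ABI (cp38-abi3), so a single wheel covers Python 3.8 and newer. A minimal install sketch, assuming one of the wheels above has already been downloaded (the +cu124torch2.5 suffix is a local version tag identifying the CUDA/torch build, not something you need to type separately):

    pip install ./flashinfer_python-0.2.4+cu124torch2.5-cp38-abi3-linux_x86_64.whl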