llama.cpp/scripts/git-bisect-run.sh
Johannes Gäßler a976ff081b
llama: end-to-end tests (#19802)
* tests: add end-to-end tests per model architecture

* fixup for rebase

* fix use-after-free in llama-model-loader.cpp

* fix CI

* fix WebGPU

* fix CI

* disable CI for macOS-latest-cmake-arm64

* use expert_weights_scale only if != 0.0f

* comments
2026-03-08 12:30:21 +01:00


#!/usr/bin/env bash
# Helper intended for `git bisect run`: splits its arguments into cmake
# cache settings (-D*) and arguments for the llama-results binary,
# builds llama-results in a scratch directory, then runs it.
cmake_args=()
llama_results_args=()
for arg in "$@"; do
    if [[ "$arg" == -D* ]]; then
        cmake_args+=("$arg")
    else
        llama_results_args+=("$arg")
    fi
done
dir="build-bisect"
rm -rf "${dir}" > /dev/null
# Use "${cmake_args[@]}" (not ${cmake_args}, which expands to only the
# first element) so every -D* setting reaches cmake.
cmake -B "${dir}" -S . "${cmake_args[@]}" > /dev/null
cmake --build "${dir}" -t llama-results -j "$(nproc)" > /dev/null
"${dir}/bin/llama-results" "${llama_results_args[@]}"
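
The argument-splitting rule above can be sketched in isolation: anything beginning with `-D` is treated as a cmake cache setting, and everything else is forwarded verbatim. The function name `split_args` and the sample arguments below are illustrative, not part of the script.

```shell
#!/usr/bin/env bash
# Sketch of the script's splitting rule, under the assumption stated
# in the lead-in: -D* goes to cmake, the rest to llama-results.
split_args() {
    cmake_args=()
    rest_args=()
    for arg in "$@"; do
        if [[ "$arg" == -D* ]]; then
            cmake_args+=("$arg")
        else
            rest_args+=("$arg")
        fi
    done
}

# Hypothetical mixed argument list: one cmake setting, two passthroughs.
split_args -DGGML_CUDA=ON --model foo.gguf
echo "cmake: ${cmake_args[*]}"   # cmake: -DGGML_CUDA=ON
echo "rest:  ${rest_args[*]}"    # rest:  --model foo.gguf
```

Because a fresh `build-bisect` directory is created on every run, each bisect step builds from a clean configuration, at the cost of a full rebuild per step.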