Update llama.cpp vectors to support default length and batch parameters.

- Default `n_ctx` to `maxlength` if available. Otherwise, default `n_ctx=0`, which sets `n_ctx=n_ctx_train`.
- Default `n_batch` to `encodebatch`.
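The defaulting described above can be sketched as follows. This is a minimal illustration, not the actual implementation: the helper name `llama_args` and the fallback value for `encodebatch` are assumptions; `maxlength` and `encodebatch` are the vector config keys named in this change, and `n_ctx=0` is the llama.cpp convention for "use the model's training context size (`n_ctx_train`)".

```python
def llama_args(config):
    """Hypothetical helper: derive llama.cpp arguments from vector config.

    - n_ctx defaults to config["maxlength"] when set; otherwise 0,
      which llama.cpp interprets as n_ctx = n_ctx_train
    - n_batch defaults to config["encodebatch"]
    """
    args = {}

    # Use maxlength when available, else 0 (=> n_ctx_train)
    args["n_ctx"] = config.get("maxlength", 0)

    # Use encodebatch for the batch size (fallback value is an assumption)
    args["n_batch"] = config.get("encodebatch", 512)

    return args


# Example: both keys set
print(llama_args({"maxlength": 2048, "encodebatch": 64}))
# {'n_ctx': 2048, 'n_batch': 64}

# Example: no maxlength -> n_ctx=0, deferring to n_ctx_train
print(llama_args({"encodebatch": 64}))
# {'n_ctx': 0, 'n_batch': 64}
```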