Tested on llama.cpp builds b2724 and b2781 (latest).

Start the server:
./server -m ~/.tabby/models/TabbyML/CodeQwen-7B/ggml/q8_0.v2.gguf -c 2048 --port 8081
curl --request POST \
  --url http://localhost:8081/completion \
  --header 'content-type: application/json' \
  --data '{"prompt": "<fim_prefix>// Path: crates/tabby/src/main.rs\n// /// Download the language model for serving.\n// Download(download::DownloadArgs),\n//\n// Path: crates/tabby/src/main.rs\n// #[derive(Subcommand)]\n// pub enum Commands {\n// /// Starts the api endpoint for IDE / Editor extensions.\n// Serve(serve::ServeArgs),\n//\n// /// Download the language model for serving.\n// Download(download::DownloadArgs),\n//\n// /// Run scheduler progress for cron jobs integrating external code repositories.\n// Scheduler(SchedulerArgs),\n//\n// /// Run completion model as worker\n// #[cfg(feature = \"ee\")]\n// #[clap(name = \"worker::completion\", hide = true)]\n// WorkerCompletion(worker::WorkerArgs),\n//\n Commands::Scheduler(SchedulerArgs {\n now,\n url: Some(url),\n token: Some(token),\n }) => {\n let client = tabby_webserver::public::create_scheduler_client(&url, &token).await;\n tabby_scheduler::scheduler(now, client).await\n }\n Commands::Scheduler(SchedulerArgs { now, .. }) => {\n tabby_scheduler::scheduler(now, ConfigRepositoryAccess).await\n }\n #[cfg(feature = \"ee\")]\n Commands::WorkerCompletion(ref args) => {\n worker::main(tabby_webserver::public::WorkerKind::Completion, args).await\n }\n #[cfg(feature = \"ee\")]\n Commands::WorkerChat(ref args) => {\n worker::main(tabby_webserver::public::WorkerKind::Chat, args).await\n }\n Commands::Download(<fim_suffix>)\n }\n}\n\n#[macro_export]\nmacro_rules! fatal {\n ($msg:expr) => {\n ({\n tracing::error!($msg);\n std::process::exit(1);\n })\n };\n\n ($fmt:expr, $($arg:tt)*) => {\n ({\n tracing::error!($fmt, $($arg)*);\n std::process::exit(1);\n })\n };\n}\n<fim_middle>", "n_predict": 12}'
Output:
{ "content": " ref args) => {\n tabby_webs", ... }
Maybe related: the generated token has a "▁" (U+2581) prefix in the HF vocab file.
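For context on the suspected cause, here is a minimal sketch (not llama.cpp code, just an illustration): SentencePiece-style vocabularies, as used in the HF tokenizer files, encode a leading space as the character "▁" (U+2581), so the piece for " ref" is stored as "▁ref". A detokenizer that fails to map "▁" back to a space would emit the raw prefix instead of the leading space seen in the output above.

```python
# Hypothetical illustration of SentencePiece "▁" handling; the function
# name decode_piece is made up for this sketch, not a real llama.cpp API.
def decode_piece(piece: str) -> str:
    """Map the SentencePiece whitespace marker U+2581 back to a space."""
    return piece.replace("\u2581", " ")

# Pieces roughly matching the completion " ref args)" from the output above.
pieces = ["\u2581ref", "\u2581args", ")"]
text = "".join(decode_piece(p) for p in pieces)
print(text)  # -> " ref args)"
```

If the server's detokenization skipped this mapping for some token IDs, the first generated token would surface with the "▁" prefix intact, which matches the behavior described here.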