# README

- **Discovery Desktop**: [desktop.md](desktop.md)
- **Inference**: use vLLM or llama.cpp (see the sketch after this list)
  - For a server: [vllm_deploy.md](vllm_deploy.md)
  - For a PC: [llama-cpp.md](llama-cpp.md)
- **Discovery Server**: [discovery_setup.md](discovery_setup.md)
- **Discovery UX**: [discovery_ux.md](discovery_ux.md)
- **Proxy Server Setup (for HTTPS)**: [proxy_setup_vm.md](proxy_setup_vm.md)
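Both inference back ends listed above expose an OpenAI-compatible HTTP API once running (vLLM's server, or llama.cpp's server mode). The snippet below is a minimal sketch of querying such an endpoint; the base URL, port, and model name are assumptions rather than values from the deployment guides, so adjust them to match your setup from `vllm_deploy.md` or `llama-cpp.md`.

```python
# Minimal sketch: query an OpenAI-compatible inference endpoint.
# Assumptions: the server listens on localhost:8000 and the model
# name below is a placeholder -- replace both with your deployment's values.
import requests

BASE_URL = "http://localhost:8000/v1"  # assumed endpoint, adjust as needed

payload = {
    "model": "your-model-name",  # placeholder model identifier
    "messages": [{"role": "user", "content": "Hello, Discovery!"}],
    "max_tokens": 64,
}

# Send a chat completion request and print the first response message.
resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Because both back ends speak the same API shape, client code like this should not need to change when switching between the server (vLLM) and PC (llama.cpp) setups; only the base URL and model name differ.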