
OVMS service starts and then shuts down complaining about config file #4113

@Tre-vor-W

Description


Describe the bug
OVMS service starts and then shuts down complaining about the config file: "Failed to open file for debug-read".

To Reproduce
Steps to reproduce the behavior:

  1. Steps to prepare models repository '...'
  2. OVMS launch command '"C:\Program Files (x86)\Intel\ovms\ovms.exe" --rest_port 8000 --config_path c:\Users\FLO.lmstudio\models --log_level INFO --log_path "C:\Program Files (x86)\Intel\ovms\ovms_server.log"'
  3. Client command (additionally client code if not using official client or demo) '....'
  4. See error
  5. [2026-04-05 20:47:18.393][9244][serving][info][server.cpp:88] OpenVINO Model Server 2025.4.1.7bc56cf8
    [2026-04-05 20:47:18.393][9244][serving][info][server.cpp:89] OpenVINO backend 2025.4.1.0rc1
    [2026-04-05 20:47:18.394][9244][serving][info][pythoninterpretermodule.cpp:37] PythonInterpreterModule starting
    [2026-04-05 20:47:18.459][9244][serving][info][pythoninterpretermodule.cpp:50] PythonInterpreterModule started
    [2026-04-05 20:47:18.700][9244][modelmanager][info][modelmanager.cpp:156] Available devices for Open VINO: CPU, GPU
    [2026-04-05 20:47:18.702][9244][serving][info][capimodule.cpp:40] C-APIModule starting
    [2026-04-05 20:47:18.702][9244][serving][info][capimodule.cpp:42] C-APIModule started
    [2026-04-05 20:47:18.702][9244][serving][info][grpcservermodule.cpp:110] GRPCServerModule starting
    [2026-04-05 20:47:18.702][9244][serving][info][grpcservermodule.cpp:114] GRPCServerModule started
    [2026-04-05 20:47:18.703][9244][serving][info][grpcservermodule.cpp:115] Port was not set. GRPC server will not be started.
    [2026-04-05 20:47:18.703][9244][serving][info][httpservermodule.cpp:35] HTTPServerModule starting
    [2026-04-05 20:47:18.703][9244][serving][info][httpservermodule.cpp:39] Will start 18 REST workers
    [2026-04-05 20:47:18.704][6552][serving][info][drogon_http_server.cpp:137] Binding REST server to address: 0.0.0.0:8000
    [2026-04-05 20:47:18.765][9244][serving][info][drogon_http_server.cpp:167] REST server listening on port 8000 with 18 unary threads and 18 streaming threads
    [2026-04-05 20:47:18.765][9244][serving][info][http_server.cpp:248] API key not provided via --api_key_file or API_KEY environment variable. Authentication will be disabled.
    [2026-04-05 20:47:18.768][9244][serving][info][httpservermodule.cpp:52] HTTPServerModule started
    [2026-04-05 20:47:18.768][9244][serving][info][httpservermodule.cpp:53] Started REST server at 0.0.0.0:8000
    [2026-04-05 20:47:18.769][9244][serving][info][servablemanagermodule.cpp:51] ServableManagerModule starting
    [2026-04-05 20:47:18.769][9244][serving][error][schema.cpp:524] Failed to open file for debug-read on Windows
    [2026-04-05 20:47:18.769][9244][serving][error][schema.cpp:570] Configuration file is invalid c:\Users\FLO.lmstudio\models
    [2026-04-05 20:47:18.781][9244][serving][error][schema.cpp:524] Failed to open file for debug-read on Windows
    [2026-04-05 20:47:18.782][9244][serving][error][schema.cpp:570] Configuration file is invalid c:\Users\FLO.lmstudio\models
    [2026-04-05 20:47:18.797][9244][serving][error][schema.cpp:524] Failed to open file for debug-read on Windows
    [2026-04-05 20:47:18.798][9244][serving][error][schema.cpp:570] Configuration file is invalid c:\Users\FLO.lmstudio\models
    [2026-04-05 20:47:18.813][9244][modelmanager][error][modelmanager.cpp:184] Couldn't start model manager
    [2026-04-05 20:47:18.814][9244][serving][error][servablemanagermodule.cpp:58] ovms::ModelManager::Start() Error: Configuration file not found or cannot open
    [2026-04-05 20:47:18.814][9244][serving][info][grpcservermodule.cpp:201] GRPCServerModule shutting down
    [2026-04-05 20:47:18.814][9244][serving][info][grpcservermodule.cpp:211] GRPCServerModule shutdown
    [2026-04-05 20:47:18.814][9244][serving][info][httpservermodule.cpp:59] HTTPServerModule shutting down
    [2026-04-05 20:47:18.820][9244][serving][info][httpservermodule.cpp:64] Shutdown HTTP server
    [2026-04-05 20:47:18.820][9244][serving][info][servablemanagermodule.cpp:65] ServableManagerModule shutting down
    [2026-04-05 20:47:18.826][9244][serving][info][servablemanagermodule.cpp:71] ServableManagerModule shutdown
    [2026-04-05 20:47:18.826][9244][serving][info][pythoninterpretermodule.cpp:61] PythonInterpreterModule shutting down
    [2026-04-05 20:47:18.826][9244][serving][info][pythoninterpretermodule.cpp:65] PythonInterpreterModule shutdown
    [2026-04-05 20:47:18.831][9244][serving][info][capimodule.cpp:50] C-APIModule shutting down
    [2026-04-05 20:47:18.832][9244][serving][info][capimodule.cpp:52] C-APIModule shutdown
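The errors above stem from `--config_path` receiving a directory of models rather than a JSON configuration file, which is what OVMS expects there. A minimal pre-flight check (a hypothetical helper, not part of OVMS) might look like:

```python
import json
from pathlib import Path

def check_config_path(p: str) -> str:
    """Sanity-check a value intended for ovms.exe --config_path.
    OVMS expects the path of a JSON configuration file (e.g. config.json),
    not a model directory."""
    path = Path(p)
    if path.is_dir():
        return "directory: point --config_path at a config.json file"
    if not path.is_file():
        return "file does not exist"
    try:
        json.loads(path.read_text())
    except json.JSONDecodeError as exc:
        return f"not valid JSON: {exc}"
    return "ok"
```

Running this on the directory passed in the launch command would flag it before the server ever starts its module stack.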

Expected behavior
The service should start and continue running.

Logs
Logs from OVMS, ideally with --log_level DEBUG. Logs from client.
C:\Windows\System32>"C:\Program Files (x86)\Intel\ovms\ovms.exe" --rest_port 8000 --config_path c:\Users\FLO.lmstudio\models --log_level DEBUG --log_path "C:\Program Files (x86)\Intel\ovms\ovms_server.log"
[2026-04-05 21:03:57.844][5984][serving][info][server.cpp:88] OpenVINO Model Server 2025.4.1.7bc56cf8
[2026-04-05 21:03:57.844][5984][serving][info][server.cpp:89] OpenVINO backend 2025.4.1.0rc1
[2026-04-05 21:03:57.845][5984][serving][debug][server.cpp:90] CLI parameters passed to ovms server
[2026-04-05 21:03:57.845][5984][serving][debug][server.cpp:112] config_path: c:\Users\FLO.lmstudio\models
[2026-04-05 21:03:57.845][5984][serving][debug][server.cpp:114] gRPC port: 0
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:115] REST port: 8000
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:116] gRPC bind address: 0.0.0.0
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:117] REST bind address: 0.0.0.0
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:118] REST workers: 18
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:119] gRPC workers: 1
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:120] gRPC channel arguments:
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:121] log level: DEBUG
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:122] log path: C:\Program Files (x86)\Intel\ovms\ovms_server.log
[2026-04-05 21:03:57.846][5984][serving][debug][server.cpp:124] file system poll wait milliseconds: 1000
[2026-04-05 21:03:57.847][5984][serving][debug][server.cpp:125] sequence cleaner poll wait minutes: 5
[2026-04-05 21:03:57.847][5984][serving][debug][server.cpp:126] model_repository_path:
[2026-04-05 21:03:57.847][5984][serving][info][pythoninterpretermodule.cpp:37] PythonInterpreterModule starting
Python version:
3.12.10 (tags/v3.12.10:0cc8128, Apr 8 2025, 12:21:36) [MSC v.1943 64 bit (AMD64)]
Python sys.path output:
['', 'c:\Program Files (x86)\Intel\ovms\python\python312', 'c:\Program Files (x86)\Intel\ovms\python', 'c:\Program Files (x86)\Intel\ovms\python\Scripts', 'c:\Program Files (x86)\Intel\ovms\python\Lib\site-packages']
[2026-04-05 21:03:57.912][5984][serving][debug][python_backend.cpp:46] Creating python backend
[2026-04-05 21:03:57.915][5984][serving][info][pythoninterpretermodule.cpp:50] PythonInterpreterModule started
[2026-04-05 21:03:57.917][5984][modelmanager][debug][mediapipefactory.cpp:52] Registered Calculators: AddHeaderCalculator, AlignmentPointsRectsCalculator, AnnotationOverlayCalculator, AnomalyCalculator, AnomalySerializationCalculator, AssociationNormRectCalculator, BeginLoopDetectionCalculator, BeginLoopFloatCalculator, BeginLoopGpuBufferCalculator, BeginLoopImageCalculator, BeginLoopImageFrameCalculator, BeginLoopIntCalculator, BeginLoopMatrixCalculator, BeginLoopMatrixVectorCalculator, BeginLoopModelApiDetectionCalculator, BeginLoopNormalizedLandmarkListVectorCalculator, BeginLoopNormalizedRectCalculator, BeginLoopRectanglePredictionCalculator, BeginLoopStringCalculator, BeginLoopTensorCalculator, BeginLoopUint64tCalculator, BoxDetectorCalculator, BoxTrackerCalculator, CallbackCalculator, CallbackPacketCalculator, CallbackWithHeaderCalculator, ClassificationCalculator, ClassificationListVectorHasMinSizeCalculator, ClassificationListVectorSizeCalculator, ClassificationSerializationCalculator, ClipDetectionVectorSizeCalculator, ClipNormalizedRectVectorSizeCalculator, ColorConvertCalculator, ConcatenateBoolVectorCalculator, ConcatenateClassificationListCalculator, ConcatenateClassificationListVectorCalculator, ConcatenateDetectionVectorCalculator, ConcatenateFloatVectorCalculator, ConcatenateImageVectorCalculator, ConcatenateInt32VectorCalculator, ConcatenateJointListCalculator, ConcatenateLandmarListVectorCalculator, ConcatenateLandmarkListCalculator, ConcatenateLandmarkListVectorCalculator, ConcatenateLandmarkVectorCalculator, ConcatenateNormalizedLandmarkListCalculator, ConcatenateNormalizedLandmarkListVectorCalculator, ConcatenateRenderDataVectorCalculator, ConcatenateStringVectorCalculator, ConcatenateTensorVectorCalculator, ConcatenateTfLiteTensorVectorCalculator, ConcatenateUInt64VectorCalculator, ConstantSidePacketCalculator, CountingSourceCalculator, CropCalculator, DefaultSidePacketCalculator, DequantizeByteArrayCalculator, DetectionCalculator, 
DetectionClassificationCombinerCalculator, DetectionClassificationResultCalculator, DetectionClassificationSerializationCalculator, DetectionExtractionCalculator, DetectionLabelIdToTextCalculator, DetectionLetterboxRemovalCalculator, DetectionProjectionCalculator, DetectionSegmentationCombinerCalculator, DetectionSegmentationResultCalculator, DetectionSegmentationSerializationCalculator, DetectionSerializationCalculator, DetectionsToRectsCalculator, DetectionsToRenderDataCalculator, EmbeddingsCalculatorOV, EmptyLabelCalculator, EmptyLabelClassificationCalculator, EmptyLabelDetectionCalculator, EmptyLabelRotatedDetectionCalculator, EmptyLabelSegmentationCalculator, EndLoopAffineMatrixCalculator, EndLoopBooleanCalculator, EndLoopClassificationListCalculator, EndLoopDetectionCalculator, EndLoopFloatCalculator, EndLoopGpuBufferCalculator, EndLoopImageCalculator, EndLoopImageFrameCalculator, EndLoopImageSizeCalculator, EndLoopLandmarkListVectorCalculator, EndLoopMatrixCalculator, EndLoopModelApiDetectionClassificationCalculator, EndLoopModelApiDetectionSegmentationCalculator, EndLoopNormalizedLandmarkListVectorCalculator, EndLoopNormalizedRectCalculator, EndLoopPolygonPredictionsCalculator, EndLoopRectanglePredictionsCalculator, EndLoopRenderDataCalculator, EndLoopTensorCalculator, EndLoopTfLiteTensorCalculator, FaceLandmarksToRenderDataCalculator, FeatureDetectorCalculator, FlowLimiterCalculator, FlowPackagerCalculator, FlowToImageCalculator, FromImageCalculator, GateCalculator, GetClassificationListVectorItemCalculator, GetDetectionVectorItemCalculator, GetLandmarkListVectorItemCalculator, GetNormalizedLandmarkListVectorItemCalculator, GetNormalizedRectVectorItemCalculator, GetRectVectorItemCalculator, GraphProfileCalculator, HandDetectionsFromPoseToRectsCalculator, HandLandmarksToRectCalculator, HttpLLMCalculator, HttpSerializationCalculator, ImageCloneCalculator, ImageCroppingCalculator, ImageGenCalculator, ImagePropertiesCalculator, ImageToTensorCalculator, 
ImageTransformationCalculator, ImmediateMuxCalculator, InferenceCalculatorCpu, InstanceSegmentationCalculator, InverseMatrixCalculator, IrisToRenderDataCalculator, KeypointDetectionCalculator, LandmarkLetterboxRemovalCalculator, LandmarkListVectorSizeCalculator, LandmarkProjectionCalculator, LandmarkVisibilityCalculator, LandmarksRefinementCalculator, LandmarksSmoothingCalculator, LandmarksToDetectionCalculator, LandmarksToRenderDataCalculator, MakePairCalculator, MatrixMultiplyCalculator, MatrixSubtractCalculator, MatrixToVectorCalculator, MediaPipeInternalSidePacketToPacketStreamCalculator, MergeCalculator, MergeDetectionsToVectorCalculator, MergeGpuBuffersToVectorCalculator, MergeImagesToVectorCalculator, ModelInferHttpRequestCalculator, ModelInferRequestImageCalculator, MotionAnalysisCalculator, MuxCalculator, NonMaxSuppressionCalculator, NonZeroCalculator, NormalizedLandmarkListVectorHasMinSizeCalculator, NormalizedRectVectorHasMinSizeCalculator, OpenCvEncodedImageToImageFrameCalculator, OpenCvImageEncoderCalculator, OpenCvPutTextCalculator, OpenCvVideoDecoderCalculator, OpenCvVideoEncoderCalculator, OpenVINOConverterCalculator, OpenVINOInferenceAdapterCalculator, OpenVINOInferenceCalculator, OpenVINOModelServerSessionCalculator, OpenVINOTensorsToClassificationCalculator, OpenVINOTensorsToDetectionsCalculator, OverlayCalculator, PacketGeneratorWrapperCalculator, PacketInnerJoinCalculator, PacketPresenceCalculator, PacketResamplerCalculator, PacketSequencerCalculator, PacketThinnerCalculator, PassThroughCalculator, PreviousLoopbackCalculator, PyTensorOvTensorConverterCalculator, PythonExecutorCalculator, QuantizeFloatVectorCalculator, RectToRenderDataCalculator, RectToRenderScaleCalculator, RectTransformationCalculator, RefineLandmarksFromHeatmapCalculator, RerankCalculator, RerankCalculatorOV, ResourceProviderCalculator, RoiTrackingCalculator, RotatedDetectionCalculator, RotatedDetectionSerializationCalculator, RoundRobinDemuxCalculator, S2tCalculator, 
SegmentationCalculator, SegmentationSerializationCalculator, SegmentationSmoothingCalculator, SequenceShiftCalculator, SerializationCalculator, SetLandmarkVisibilityCalculator, SidePacketToStreamCalculator, SplitAffineMatrixVectorCalculator, SplitClassificationListVectorCalculator, SplitDetectionVectorCalculator, SplitFloatVectorCalculator, SplitImageVectorCalculator, SplitJointListCalculator, SplitLandmarkListCalculator, SplitLandmarkVectorCalculator, SplitMatrixVectorCalculator, SplitNormalizedLandmarkListCalculator, SplitNormalizedLandmarkListVectorCalculator, SplitNormalizedRectVectorCalculator, SplitTensorVectorCalculator, SplitTfLiteTensorVectorCalculator, SplitUint64tVectorCalculator, SsdAnchorsCalculator, StreamToSidePacketCalculator, StringToInt32Calculator, StringToInt64Calculator, StringToIntCalculator, StringToUint32Calculator, StringToUint64Calculator, StringToUintCalculator, SwitchDemuxCalculator, SwitchMuxCalculator, T2sCalculator, TensorsToClassificationCalculator, TensorsToDetectionsCalculator, TensorsToFloatsCalculator, TensorsToLandmarksCalculator, TensorsToSegmentationCalculator, TfLiteConverterCalculator, TfLiteCustomOpResolverCalculator, TfLiteInferenceCalculator, TfLiteModelCalculator, TfLiteTensorsToDetectionsCalculator, TfLiteTensorsToFloatsCalculator, TfLiteTensorsToLandmarksCalculator, ThresholdingCalculator, ToImageCalculator, TrackedDetectionManagerCalculator, UpdateFaceLandmarksCalculator, VideoPreStreamCalculator, VisibilityCopyCalculator, VisibilitySmoothingCalculator, WarpAffineCalculator, WarpAffineCalculatorCpu, WorldLandmarkProjectionCalculator

[2026-04-05 21:03:57.917][5984][modelmanager][debug][mediapipefactory.cpp:52] Registered Subgraphs: FaceDetection, FaceDetectionFrontDetectionToRoi, FaceDetectionFrontDetectionsToRoi, FaceDetectionShortRange, FaceDetectionShortRangeByRoiCpu, FaceDetectionShortRangeCpu, FaceLandmarkCpu, FaceLandmarkFrontCpu, FaceLandmarkLandmarksToRoi, FaceLandmarksFromPoseCpu, FaceLandmarksFromPoseToRecropRoi, FaceLandmarksModelLoader, FaceLandmarksToRoi, FaceTracking, HandLandmarkCpu, HandLandmarkModelLoader, HandLandmarksFromPoseCpu, HandLandmarksFromPoseToRecropRoi, HandLandmarksLeftAndRightCpu, HandLandmarksToRoi, HandRecropByRoiCpu, HandTracking, HandVisibilityFromHandLandmarksFromPose, HandWristForPose, HolisticLandmarkCpu, HolisticTrackingToRenderData, InferenceCalculator, IrisLandmarkCpu, IrisLandmarkLandmarksToRoi, IrisLandmarkLeftAndRightCpu, IrisRendererCpu, PoseDetectionCpu, PoseDetectionToRoi, PoseLandmarkByRoiCpu, PoseLandmarkCpu, PoseLandmarkFiltering, PoseLandmarkModelLoader, PoseLandmarksAndSegmentationInverseProjection, PoseLandmarksToRoi, PoseSegmentationFiltering, SwitchContainer, TensorsToFaceLandmarks, TensorsToFaceLandmarksWithAttention, TensorsToPoseLandmarksAndSegmentation

[2026-04-05 21:03:57.918][5984][modelmanager][debug][mediapipefactory.cpp:52] Registered InputStreamHandlers: BarrierInputStreamHandler, DefaultInputStreamHandler, EarlyCloseInputStreamHandler, FixedSizeInputStreamHandler, ImmediateInputStreamHandler, MuxInputStreamHandler, SyncSetInputStreamHandler, TimestampAlignInputStreamHandler

[2026-04-05 21:03:57.918][5984][modelmanager][debug][mediapipefactory.cpp:52] Registered OutputStreamHandlers: InOrderOutputStreamHandler

[2026-04-05 21:03:58.151][5984][modelmanager][info][modelmanager.cpp:156] Available devices for Open VINO: CPU, GPU
[2026-04-05 21:03:58.151][5984][modelmanager][debug][ov_utils.hpp:56] Logging OpenVINO Core plugin: CPU; plugin configuration
[2026-04-05 21:03:58.153][5984][modelmanager][debug][ov_utils.hpp:91] OpenVINO Core plugin: CPU; plugin configuration: { AVAILABLE_DEVICES: , CPU_DENORMALS_OPTIMIZATION: NO, CPU_SPARSE_WEIGHTS_DECOMPRESSION_RATE: 1, DEVICE_ARCHITECTURE: intel64, DEVICE_ID: , DEVICE_TYPE: integrated, DYNAMIC_QUANTIZATION_GROUP_SIZE: 32, ENABLE_CPU_PINNING: YES, ENABLE_CPU_RESERVATION: NO, ENABLE_HYPER_THREADING: YES, ENABLE_TENSOR_PARALLEL: NO, ENABLE_WEIGHTLESS: NO, EXECUTION_DEVICES: CPU, EXECUTION_MODE_HINT: PERFORMANCE, FULL_DEVICE_NAME: Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz, INFERENCE_NUM_THREADS: 0, INFERENCE_PRECISION_HINT: f32, KEY_CACHE_GROUP_SIZE: 0, KEY_CACHE_PRECISION: u8, KV_CACHE_PRECISION: u8, LOG_LEVEL: LOG_NONE, MODEL_DISTRIBUTION_POLICY: , NUM_STREAMS: 1, OPTIMIZATION_CAPABILITIES: FP32 INT8 BIN EXPORT_IMPORT, PERFORMANCE_HINT: LATENCY, PERFORMANCE_HINT_NUM_REQUESTS: 0, PERF_COUNT: NO, RANGE_FOR_ASYNC_INFER_REQUESTS: 1 1 1, RANGE_FOR_STREAMS: 1 18, SCHEDULING_CORE_TYPE: ANY_CORE, VALUE_CACHE_GROUP_SIZE: 0, VALUE_CACHE_PRECISION: u8, WEIGHTS_PATH: }
[2026-04-05 21:03:58.153][5984][modelmanager][debug][ov_utils.hpp:56] Logging OpenVINO Core plugin: GPU; plugin configuration
[2026-04-05 21:03:58.154][5984][modelmanager][debug][ov_utils.hpp:91] OpenVINO Core plugin: GPU; plugin configuration: { ACTIVATIONS_SCALE_FACTOR: -1, AVAILABLE_DEVICES: 0, CACHE_DIR: , CACHE_ENCRYPTION_CALLBACKS: , CACHE_MODE: optimize_speed, COMPILATION_NUM_THREADS: 18, CONFIG_FILE: , DEVICE_ARCHITECTURE: GPU: vendor=0x8086 arch=v20.1.0, DEVICE_GOPS: {f16:116736,f32:14592,i8:233472,u8:233472}, DEVICE_ID: 0, DEVICE_LUID: 8981000000000000, DEVICE_PCI_INFO: {domain: 0 bus: 6 device: 0x10 function: 0}, DEVICE_TYPE: discrete, DEVICE_UUID: 86800be2000000000610000000000000, DYNAMIC_QUANTIZATION_GROUP_SIZE: 0, ENABLE_CPU_PINNING: NO, ENABLE_CPU_RESERVATION: NO, EXECUTION_MODE_HINT: PERFORMANCE, FULL_DEVICE_NAME: Intel(R) Arc(TM) B580 Graphics (dGPU), GPU_DEVICE_ID: 0xe20b, GPU_DEVICE_MAX_ALLOC_MEM_SIZE: 12452507648, GPU_DEVICE_TOTAL_MEM_SIZE: 12452507648, GPU_DISABLE_WINOGRAD_CONVOLUTION: NO, GPU_ENABLE_LOOP_UNROLLING: YES, GPU_ENABLE_LORA_OPERATION: YES, GPU_ENABLE_SDPA_OPTIMIZATION: YES, GPU_EXECUTION_UNITS_COUNT: 160, GPU_HOST_TASK_PRIORITY: MEDIUM, GPU_MEMORY_STATISTICS: {cl_mem:0,unknown:0,usm_device:0,usm_host:0,usm_shared:0}, GPU_QUEUE_PRIORITY: MEDIUM, GPU_QUEUE_THROTTLE: MEDIUM, GPU_UARCH_VERSION: 20.1.0, INFERENCE_PRECISION_HINT: f16, KV_CACHE_PRECISION: dynamic, MAX_BATCH_SIZE: 1, MODEL_PRIORITY: MEDIUM, MODEL_PTR: 0000000000000000, NUM_STREAMS: 1, OPTIMAL_BATCH_SIZE: 1, OPTIMIZATION_CAPABILITIES: FP32 BIN FP16 INT8 GPU_HW_MATMUL GPU_USM_MEMORY EXPORT_IMPORT, PERFORMANCE_HINT: LATENCY, PERFORMANCE_HINT_NUM_REQUESTS: 0, PERF_COUNT: NO, RANGE_FOR_ASYNC_INFER_REQUESTS: 1 2 1, RANGE_FOR_STREAMS: 1 2, WEIGHTS_PATH: }
[2026-04-05 21:03:58.154][5984][serving][info][capimodule.cpp:40] C-APIModule starting
[2026-04-05 21:03:58.155][5984][serving][info][capimodule.cpp:42] C-APIModule started
[2026-04-05 21:03:58.155][5984][serving][info][grpcservermodule.cpp:110] GRPCServerModule starting
[2026-04-05 21:03:58.155][5984][serving][info][grpcservermodule.cpp:114] GRPCServerModule started
[2026-04-05 21:03:58.155][5984][serving][info][grpcservermodule.cpp:115] Port was not set. GRPC server will not be started.
[2026-04-05 21:03:58.155][5984][serving][info][httpservermodule.cpp:35] HTTPServerModule starting
[2026-04-05 21:03:58.155][5984][serving][info][httpservermodule.cpp:39] Will start 18 REST workers
[2026-04-05 21:03:58.155][5984][serving][debug][drogon_http_server.cpp:41] Starting http thread pool for streaming (18 threads)
[2026-04-05 21:03:58.156][5984][serving][debug][drogon_http_server.cpp:43] Thread pool started
[2026-04-05 21:03:58.156][5984][serving][debug][drogon_http_server.cpp:67] DrogonHttpServer::startAcceptingRequests()
[2026-04-05 21:03:58.157][5984][serving][debug][drogon_http_server.cpp:154] Waiting for drogon to become ready on port 8000...
[2026-04-05 21:03:58.157][2264][serving][debug][drogon_http_server.cpp:103] Starting to listen on port 8000
[2026-04-05 21:03:58.157][2264][serving][debug][drogon_http_server.cpp:104] Thread pool size for unary (18 drogon threads)
[2026-04-05 21:03:58.157][2264][serving][info][drogon_http_server.cpp:137] Binding REST server to address: 0.0.0.0:8000
[2026-04-05 21:03:58.216][5984][serving][debug][drogon_http_server.cpp:163] Drogon run procedure took: 59.506 ms
[2026-04-05 21:03:58.216][5984][serving][info][drogon_http_server.cpp:167] REST server listening on port 8000 with 18 unary threads and 18 streaming threads
[2026-04-05 21:03:58.218][5984][serving][info][http_server.cpp:248] API key not provided via --api_key_file or API_KEY environment variable. Authentication will be disabled.
[2026-04-05 21:03:58.220][5984][serving][info][httpservermodule.cpp:52] HTTPServerModule started
[2026-04-05 21:03:58.220][5984][serving][info][httpservermodule.cpp:53] Started REST server at 0.0.0.0:8000
[2026-04-05 21:03:58.220][5984][serving][info][servablemanagermodule.cpp:51] ServableManagerModule starting
[2026-04-05 21:03:58.220][5984][serving][debug][schema.cpp:566] Loading configuration from c:\Users\FLO.lmstudio\models for: 1 time
[2026-04-05 21:03:58.221][5984][serving][error][schema.cpp:524] Failed to open file for debug-read on Windows
[2026-04-05 21:03:58.221][5984][serving][error][schema.cpp:570] Configuration file is invalid c:\Users\FLO.lmstudio\models
[2026-04-05 21:03:58.221][5984][serving][debug][schema.cpp:557]
[2026-04-05 21:03:58.232][5984][serving][debug][schema.cpp:566] Loading configuration from c:\Users\FLO.lmstudio\models for: 2 time
[2026-04-05 21:03:58.234][5984][serving][error][schema.cpp:524] Failed to open file for debug-read on Windows
[2026-04-05 21:03:58.234][5984][serving][error][schema.cpp:570] Configuration file is invalid c:\Users\FLO.lmstudio\models
[2026-04-05 21:03:58.234][5984][serving][debug][schema.cpp:557]
[2026-04-05 21:03:58.248][5984][serving][debug][schema.cpp:566] Loading configuration from c:\Users\FLO.lmstudio\models for: 3 time
[2026-04-05 21:03:58.250][5984][serving][error][schema.cpp:524] Failed to open file for debug-read on Windows
[2026-04-05 21:03:58.250][5984][serving][error][schema.cpp:570] Configuration file is invalid c:\Users\FLO.lmstudio\models
[2026-04-05 21:03:58.250][5984][serving][debug][schema.cpp:557]
[2026-04-05 21:03:58.264][5984][modelmanager][error][modelmanager.cpp:184] Couldn't start model manager
[2026-04-05 21:03:58.266][5984][serving][error][servablemanagermodule.cpp:58] ovms::ModelManager::Start() Error: Configuration file not found or cannot open
[2026-04-05 21:03:58.266][5984][serving][info][grpcservermodule.cpp:201] GRPCServerModule shutting down
[2026-04-05 21:03:58.266][5984][serving][info][grpcservermodule.cpp:211] GRPCServerModule shutdown
[2026-04-05 21:03:58.266][5984][serving][info][httpservermodule.cpp:59] HTTPServerModule shutting down
[2026-04-05 21:03:58.271][2264][serving][debug][drogon_http_server.cpp:144] drogon::run() exits normally
[2026-04-05 21:03:58.272][5984][serving][info][httpservermodule.cpp:64] Shutdown HTTP server
[2026-04-05 21:03:58.272][5984][serving][info][servablemanagermodule.cpp:65] ServableManagerModule shutting down
[2026-04-05 21:03:58.278][5984][serving][info][servablemanagermodule.cpp:71] ServableManagerModule shutdown
[2026-04-05 21:03:58.278][5984][serving][info][pythoninterpretermodule.cpp:61] PythonInterpreterModule shutting down
[2026-04-05 21:03:58.279][5984][serving][debug][python_backend.cpp:52] Python backend destructor start
[2026-04-05 21:03:58.279][5984][serving][debug][python_backend.cpp:56] Python backend destructor end
[2026-04-05 21:03:58.279][5984][serving][info][pythoninterpretermodule.cpp:65] PythonInterpreterModule shutdown
[2026-04-05 21:03:58.284][5984][serving][info][capimodule.cpp:50] C-APIModule shutting down
[2026-04-05 21:03:58.285][5984][serving][info][capimodule.cpp:52] C-APIModule shutdown

Configuration

  1. OVMS version 2025.4.1.7bc56cf8

  2. OVMS config.json file

  3. {
       "model_config_list": [
         {
           "config": {
             "name": "OpenVINO/Qwen3-8B-int4-ov",
             "base_path": "c:\Users\FLO\.lmstudio\models\OpenVINO\Qwen3-8B-int4-ov"
           }
         }
       ]
     }
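One more pitfall with the file itself: JSON treats a single backslash as the start of an escape sequence, so a Windows path written with lone backslashes in `base_path` will not parse; it needs doubled backslashes (or forward slashes). A quick sketch of the difference (illustrative paths, not the actual config):

```python
import json

# A lone backslash starts a JSON escape, so \U in the path is rejected:
bad = r'{"base_path": "c:\Users\FLO"}'
try:
    json.loads(bad)
except json.JSONDecodeError:
    print("invalid escape in path")

# Doubled backslashes (or forward slashes) parse cleanly:
good = r'{"base_path": "c:\\Users\\FLO"}'
print(json.loads(good)["base_path"])  # c:\Users\FLO
```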

  4. CPU, accelerator's versions if applicable

  5. Model repository directory structure

  6. Model or publicly available similar model that reproduces the issue
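Item 5 is worth filling in: OVMS serves models from numeric version subdirectories under each `base_path` (e.g. `1/` containing the model files), so a base_path with no such subdirectories will also fail to load. A small sketch (hypothetical helper) that lists the versions OVMS would discover:

```python
from pathlib import Path

def list_model_versions(base_path: str) -> list[int]:
    """Return the numeric version subdirectories (1/, 2/, ...) found
    under a model's base_path, which is the layout OVMS expects."""
    return sorted(
        int(d.name)
        for d in Path(base_path).iterdir()
        if d.is_dir() and d.name.isdigit()
    )
```

An empty result here would point at a repository-layout problem rather than a config-file problem.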


Metadata

Labels

bug: Something isn't working