Dear NovelCrafter Support Team,
I am writing to report a potential bug regarding the "AI Connection" feature within the NovelCrafter web application. Despite successfully establishing a connection between the web app and my locally hosted Ollama API via an ngrok tunnel, the interface fails to recognize the valid model list returned by the API, incorrectly displaying "No models found."
Below is a detailed account of my setup and testing process for your investigation.
  1. Objective
To configure the NovelCrafter Web App to use a large language model (LLM) running locally on my MacBook Air M2 (Apple Silicon) via Ollama, using ngrok to create a secure tunnel for the connection.
  2. Local Environment Setup & Verification (SUCCESS)
Software: Ollama is installed and functioning correctly.
Model: The llama3 model was pulled successfully and operates as expected through the local command line (ollama run llama3).
Local API Test: I verified the native Ollama API is responsive by calling curl http://localhost:11434/api/tags from my terminal, which returned a valid JSON list of available models. This confirmed Ollama was running correctly on its default port.
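For reproducibility, the local check and the kind of parsing a client would do can be sketched as follows. The curl command is the one I ran; the `TAGS` literal below is an abridged example of the `/api/tags` response shape (not my exact output), used so the parsing step can be shown offline:

```shell
# Query the native Ollama API for installed models (run while Ollama is serving):
#   curl http://localhost:11434/api/tags
#
# Abridged example of the /api/tags response shape, parsed as a client might:
TAGS='{"models":[{"name":"llama3:latest","model":"llama3:latest"}]}'
echo "$TAGS" | python3 -c 'import json,sys; print(json.load(sys.stdin)["models"][0]["name"])'
# prints: llama3:latest
```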
  3. Attempted Direct Connection (EXPECTED FAILURE)
Initially, I tried configuring the AI Connection in NovelCrafter with my local network IP address (http://192.168.1.104:11434). As anticipated, this failed due to browser-enforced security restrictions:
CORS (Cross-Origin Resource Sharing): Requests from https://app.novelcrafter.com to my local HTTP address were blocked.
Mixed Content: The browser prevented a secure (HTTPS) page from accessing an insecure (HTTP) resource.
  4. Ngrok Tunnel Setup (SUCCESS)
To resolve the network issues, I employed ngrok to create a secure proxy tunnel.
Installation & Auth: I installed ngrok and authenticated it successfully using ngrok config add-authtoken [MY_AUTHTOKEN].
Tunnel Creation: I started a tunnel to my local Ollama instance using the command ngrok http 11434. Ngrok provided a public, secure HTTPS URL for the tunnel (e.g., https://7faa653cd726.ngrok-free.app).
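For completeness, the exact command sequence was (authtoken redacted):

```shell
# One-time authentication
ngrok config add-authtoken [MY_AUTHTOKEN]

# Tunnel the local Ollama port; ngrok prints a public HTTPS forwarding URL
ngrok http 11434
```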
  5. Critical Validation: Testing the Tunnel (SUCCESS)
To isolate the problem, I bypassed NovelCrafter and tested the ngrok tunnel endpoint directly from my browser.
Test 1 - OpenAI-Compatible Endpoint: Accessing https://7faa653cd726.ngrok-free.app/v1/models returned a valid JSON response:
```json
{"object":"list","data":[{"id":"llama3:latest","object":"model","created":1757076014,"owned_by":"library"}]}
```
Test 2 - Native Ollama Endpoint: Accessing https://7faa653cd726.ngrok-free.app/api/tags also returned a valid response, listing the model in a different format.
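These checks can also be scripted. The sketch below extracts the model id from the /v1/models response (the `RESPONSE` literal is the actual payload I captured), mimicking what an OpenAI-compatible client would do:

```shell
# The /v1/models response captured from the ngrok tunnel
RESPONSE='{"object":"list","data":[{"id":"llama3:latest","object":"model","created":1757076014,"owned_by":"library"}]}'

# Extract model ids the way an OpenAI-compatible client would
echo "$RESPONSE" | python3 -c 'import json,sys; print("\n".join(m["id"] for m in json.load(sys.stdin)["data"]))'
# prints: llama3:latest
```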
Conclusion: These tests prove conclusively that:
The ngrok tunnel is functioning correctly.
My local Ollama server is receiving requests and responding appropriately.
The Ollama server is configured to allow cross-origin requests (via the OLLAMA_ORIGINS="*" environment variable).
The OpenAI-compatible API endpoint (/v1/models) is active and providing the expected response format.
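For reference, here is how the OLLAMA_ORIGINS variable was applied; both variants below are sketches of the documented approaches (the launchctl variant applies when running the macOS Ollama.app rather than `ollama serve` from a terminal):

```shell
# Option A: set for the current shell session, then start the server manually
OLLAMA_ORIGINS="*" ollama serve

# Option B (macOS Ollama.app): set a user-level environment variable,
# then restart the Ollama application so it picks up the value
launchctl setenv OLLAMA_ORIGINS "*"
```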
  6. The Bug: NovelCrafter Web App Behavior
With the tunnel verified, I configured the NovelCrafter AI Connection:
Connection Type: Ollama (also tested OpenAI/Custom).
Result: The "Supported Models" feature consistently returns "No models found".
Anomaly: Leaving the Base URL field blank sometimes causes the interface to briefly display the llama3:latest model found on localhost, but it remains unselectable.
  7. Summary and Request
The complete technical stack is operational—Ollama, the network tunnel, and the API endpoints are all working and accessible from the public internet. The evidence strongly suggests that the issue lies within the NovelCrafter web client's logic for parsing the successful response from the /v1/models endpoint when it is routed through such a tunnel.
Could you please investigate why the web client fails to recognize the valid model list from my self-hosted setup? I believe this is a valuable use case for users who prioritize privacy and wish to use local models.
Thank you for your time and assistance.
Best regards,
Tommy Chan