Bug reports

Anonymous

Be descriptive and include any steps you have taken. Screenshots also help! For billing/account-related questions or sensitive matters, please contact us directly at support@novelcrafter.com
Bug Report: Web App's "AI Connection" Fails to Parse Valid Response from Self-Hosted Ollama via Ngrok Tunnel
Dear NovelCrafter Support Team,

I am writing to report a potential bug in the "AI Connection" feature of the NovelCrafter web application. Despite successfully establishing a connection between the web app and my locally hosted Ollama API via an ngrok tunnel, the interface fails to recognize the valid model list returned by the API and incorrectly displays "No models found." Below is a detailed account of my setup and testing process for your investigation.

Objective
To configure the NovelCrafter web app to use a large language model (LLM) running locally on my MacBook Air M2 (Apple Silicon) via Ollama, with ngrok providing a secure tunnel for the connection.

Local Environment Setup & Verification (SUCCESS)
- Software: Ollama is installed and functioning correctly.
- Model: The llama3 model was pulled successfully and operates as expected from the local command line (ollama run llama3).
- Local API Test: I verified that the native Ollama API is responsive by calling curl http://localhost:11434/api/tags from my terminal, which returned a valid JSON list of available models. This confirmed Ollama was running correctly on its default port.

Attempted Direct Connection (EXPECTED FAILURE)
Initially, I tried configuring the AI Connection in NovelCrafter with my local network IP address (http://192.168.1.104:11434). As anticipated, this failed due to browser-enforced security restrictions:
- CORS (Cross-Origin Resource Sharing): Requests from https://app.novelcrafter.com to my local HTTP address were blocked.
- Mixed Content: The browser prevented a secure (HTTPS) page from accessing an insecure (HTTP) resource.

Ngrok Tunnel Setup (SUCCESS)
To resolve the network issues, I used ngrok to create a secure proxy tunnel.
- Installation & Auth: I installed ngrok and authenticated it successfully using ngrok config add-authtoken [MY_AUTHTOKEN].
- Tunnel Creation: I started a tunnel to my local Ollama instance with ngrok http 11434.
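For reference, the setup and verification steps above can be summarized as the following shell session (a sketch: [MY_AUTHTOKEN] is a placeholder from the report, and OLLAMA_ORIGINS="*" is the permissive cross-origin setting mentioned later in the report — it must be set in the environment of the Ollama server process):

```shell
# Verify the local Ollama install and the pulled model.
ollama run llama3                          # interactive sanity check
curl http://localhost:11434/api/tags       # should return a JSON list of models

# Allow cross-origin requests before starting the Ollama server.
export OLLAMA_ORIGINS="*"

# Create a secure public HTTPS tunnel to the local Ollama port.
ngrok config add-authtoken [MY_AUTHTOKEN]  # placeholder token
ngrok http 11434                           # prints a public https://*.ngrok-free.app URL
```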
Ngrok provided a public, secure HTTPS URL for the tunnel (e.g., https://7faa653cd726.ngrok-free.app).

Critical Validation: Testing the Tunnel (SUCCESS)
To isolate the problem, I bypassed NovelCrafter and tested the ngrok tunnel endpoint directly from my browser.
- Test 1 - OpenAI-compatible endpoint: Accessing https://7faa653cd726.ngrok-free.app/v1/models returned a perfectly valid JSON response:
  {"object":"list","data":[{"id":"llama3:latest","object":"model","created":1757076014,"owned_by":"library"}]}
- Test 2 - Native Ollama endpoint: Accessing https://7faa653cd726.ngrok-free.app/api/tags also returned a valid response, listing the model in a different format.

Conclusion: These tests demonstrate that:
- The ngrok tunnel is functioning correctly.
- My local Ollama server is receiving requests and responding appropriately.
- The Ollama server is configured to allow cross-origin requests (via the OLLAMA_ORIGINS="*" environment variable).
- The OpenAI-compatible API endpoint (/v1/models) is active and provides the expected response format.

The Bug: NovelCrafter Web App Behavior
With the tunnel verified, I configured the NovelCrafter AI Connection:
- Connection Type: Ollama (also tested OpenAI/Custom).
- Base URL: https://7faa653cd726.ngrok-free.app/v1
- Result: The "Supported Models" feature consistently returns "No models found".
- Anomaly: Leaving the Base URL field blank sometimes causes the interface to briefly display the llama3:latest model found on localhost, but it remains unselectable.

Summary and Request
The complete technical stack is operational: Ollama, the network tunnel, and the API endpoints are all working and accessible from the public internet. The evidence strongly suggests that the issue lies in the NovelCrafter web client's logic for parsing the successful response from the /v1/models endpoint when it is routed through such a tunnel. Could you please investigate why the web client fails to recognize the valid model list from my self-hosted setup?
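To illustrate what a successful parse would look like, here is a minimal sketch of extracting model names from the two payloads above. The /v1/models payload is the exact response my tunnel returned; the /api/tags shape (a "models" array of objects with a "name" field) is assumed from Ollama's native API, and the function name extract_model_ids is hypothetical — this is not NovelCrafter's actual client code.

```python
import json

def extract_model_ids(payload: str) -> list[str]:
    """Extract model names from either an OpenAI-compatible /v1/models
    response or a native Ollama /api/tags response (assumed shape)."""
    doc = json.loads(payload)
    if "data" in doc:
        # OpenAI-compatible: {"object": "list", "data": [{"id": ...}, ...]}
        return [m["id"] for m in doc["data"]]
    if "models" in doc:
        # Native Ollama /api/tags (assumed): {"models": [{"name": ...}, ...]}
        return [m["name"] for m in doc["models"]]
    return []

# The exact /v1/models response returned through my ngrok tunnel:
v1_response = ('{"object":"list","data":[{"id":"llama3:latest",'
               '"object":"model","created":1757076014,"owned_by":"library"}]}')
print(extract_model_ids(v1_response))  # → ['llama3:latest']
```

Either format yields the model name, so a client handling both shapes should find llama3:latest from this response.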
I believe this is a valuable use case for users who prioritize privacy and wish to use local models. Thank you for your time and assistance.

Best regards,
Tommy Chan
Unable to Open Multiple Scenes from Codex Mentions Tab in Write Interface
After opening a scene in Write via the Codex Mentions tab, attempting to open a different scene using the same method does nothing. Navigation is only restored after switching to Plan view.

Reporter's severity assessment: 3 (where 1 = SHOW STOPPER and 5 = TRIVIAL)

Steps to Reproduce:
1. Open a project in Novelcrafter.
2. Navigate to the Codex and select any codex entry.
3. Click on the Mentions tab.
4. From the list, click the arrow beside a scene to open it in the Write interface. - The selected scene opens successfully.
5. Without leaving the Write interface, return to the Codex Mentions tab.
6. Click the arrow for a different scene in the Mentions list. - Nothing happens; the scene does not change within the Write interface.

Observed Behavior: Only the first scene selected via the Mentions tab loads into Write. Subsequent attempts from the same interface do not work until you manually switch away from Write (e.g., to Plan).

Expected Behavior: Clicking the arrow for any scene in the Mentions tab should open that scene immediately in the Write interface, regardless of which scene was previously loaded.

Workaround: Switch to Plan view before trying to open another scene from the Mentions tab; this resets navigation and restores the ability to select scenes. I have not tested other workarounds, but any switch away from Write will likely work the same way.

Screen captures: Two provided. 1) The first capture shows the interface after clicking the "open" arrow associated with the first set of occurrences; note that the Write interface opened correctly into the relevant scene. 2) The second capture shows the "open" button associated with the second scene where the Mentions are present; pressing this "open" button does nothing.

Comments: I see there may have been previous reports of this issue, but they did not contain steps to reproduce or workarounds, so I have included those items in this defect report in the hope that the information will help the developers.