Increase Ollama timeout to 1800s for long video transcripts
47-minute videos produce ~48K chars of transcript, which takes >10 minutes for llama3.1:8b on CPU to process.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
parent c06e17c016
commit d480c635ff
@@ -87,7 +87,7 @@ Identify the {settings.target_clips} best viral clips from this transcript."""

     logger.info(f"Sending transcript to Ollama ({settings.ollama_model})...")

-    async with httpx.AsyncClient(timeout=600.0) as client:
+    async with httpx.AsyncClient(timeout=1800.0) as client:
         response = await client.post(
             f"{settings.ollama_url}/api/chat",
             json={
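Rather than hard-coding 1800 s for every request, the timeout could be scaled to the transcript length. A minimal sketch of that idea; the helper name, the per-character scaling factor, and the floor/ceiling constants are assumptions calibrated only from the commit message's data point (~48K chars taking over 10 minutes on llama3.1:8b, CPU), not part of this commit:

```python
# Hypothetical helper: size the HTTP read timeout to the transcript.
# Calibration: ~48K chars took >600 s on llama3.1:8b (CPU), so allow
# roughly 25 ms per character, clamped between the old and new limits.
SECONDS_PER_CHAR = 0.025
MIN_TIMEOUT = 600.0   # old default; never go below it
MAX_TIMEOUT = 1800.0  # ceiling introduced by this commit

def ollama_timeout(transcript: str) -> float:
    """Return a timeout in seconds proportional to transcript size."""
    estimate = len(transcript) * SECONDS_PER_CHAR
    return max(MIN_TIMEOUT, min(estimate, MAX_TIMEOUT))
```

The returned value could then be passed as `httpx.AsyncClient(timeout=ollama_timeout(transcript))`, so short transcripts keep the 600 s floor while very long ones are capped at 1800 s.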