
Show HN: Execute local LLM prompts in remote SSH shell sessions

Posted by smudgy3746 | 2 hours ago | 2 comments

abhi-3 2 hours ago

How does it handle large outputs? If I pipe a big config file through, does the full content travel back to the local machine for LLM processing, or is there any streaming/chunking?
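For context, the chunking the commenter is asking about is a standard technique for handling inputs larger than a model's context window. A minimal sketch (hypothetical, not the tool's actual implementation; `max_chars` and the splitting strategy are assumptions):

```python
def chunk_text(text: str, max_chars: int = 4000) -> list[str]:
    """Split text into fixed-size chunks so each fits in a model's context window.

    A real pipeline might split on line or token boundaries instead of raw
    character offsets; this is the simplest possible version.
    """
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


# Simulate a large piped input (e.g. a big config file read from stdin).
config = "x" * 10000
chunks = chunk_text(config)
# Each chunk would then be sent to the local LLM in turn, rather than
# shipping the entire content in a single request.
```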