Releases: editor-code-assistant/eca

0.125.0

12 Apr 23:46
5e79e47

  • Refresh auth token before each LLM API call, preventing stale tokens during long-running tool calls.
  • Add background shell command support via background parameter on shell_command tool and new bg_job tool for managing long-running processes. #77
  • Add disabled field to MCP server notifications, allowing clients to distinguish between stopped and config-disabled servers. #403
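The new background mode is requested through the `shell_command` tool's arguments. A minimal sketch of what such a tool call might look like — only the `background` parameter and the `bg_job` tool name come from the release note; the `command` argument name and overall call shape are assumptions:

```json
{
  "name": "shell_command",
  "arguments": {
    "command": "npm run dev",
    "background": true
  }
}
```

The long-running process started this way would then be managed (listed, polled, stopped) via the separate `bg_job` tool.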

0.124.5

09 Apr 16:29
bd779d5

  • Fix GitHub Enterprise models fetch. #402

0.124.4

09 Apr 13:42
5c4f53d

  • Fix github-copilot retry loops in chats.
  • Fix GitHub Enterprise to use the proper URL during prompts. #402

0.124.3

08 Apr 18:55
99993f4

  • Fix /resume broken for OpenAI chats: handle nil reasoning text during replay, preserve prompt-id after chat replacement, and clear UI before replaying messages. #400
  • Add GitHub Enterprise support for Copilot authentication via auth.url and auth.clientId provider config. #402
  • Add chat/update notification for renaming chats. Chat titles are now persisted to the database and broadcast to all connected clients, including the remote web interface.
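A hedged sketch of the Enterprise provider config — the release note names only the `auth.url` and `auth.clientId` keys; the surrounding nesting (a `providers` map with a `github-copilot` entry) is an assumption, and the values are placeholders:

```json
{
  "providers": {
    "github-copilot": {
      "auth": {
        "url": "https://github.example.com",
        "clientId": "<your-enterprise-oauth-client-id>"
      }
    }
  }
}
```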

0.124.2

07 Apr 17:41
b290604

  • Fix OpenAI Responses API tool calls not executing when streaming response returns empty output, and fix spurious retries caused by stale tool-call state with Copilot encrypted IDs. #398

0.124.1

07 Apr 14:28
16c62d3

  • Add cacheRetention provider config for Anthropic to support 1-hour prompt cache TTL. Set to "long" for sessions with pauses longer than 5 minutes.
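A sketch of where the new option might sit in the config — the `cacheRetention` key and the `"long"` value come from the release note, while the `providers`/`anthropic` nesting is an assumption:

```json
{
  "providers": {
    "anthropic": {
      "cacheRetention": "long"
    }
  }
}
```

With the default (short) retention, Anthropic's prompt cache expires after about 5 minutes of inactivity, so `"long"` is the setting to reach the 1-hour TTL mentioned above.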

0.124.0

07 Apr 12:24
a58d50f

  • Add chatRetentionDays config to control chat and cache cleanup retention period, default changed from 7 to 14 days. Set to 0 to disable cleanup. #393
  • Preserve full chat history across compactions using tombstone markers instead of replacing messages. #394
  • Add message flags — named checkpoints for resuming and forking chats. #395
  • Fix OpenAI models getting stuck at toolCallPrepare when streaming response returns empty output in response.completed. #398
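A minimal sketch of the retention setting, assuming `chatRetentionDays` is a top-level config key (the release note does not state where it nests):

```json
{
  "chatRetentionDays": 0
}
```

`0` disables chat and cache cleanup entirely; omitting the key uses the new 14-day default.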

0.123.3

04 Apr 20:28
172e6d5

  • Fix exceptions on OpenAI Responses API models when creating tasks.
  • Fix potential infinite auto-compact loop when context overflow persists after compaction. #391
  • Improve Anthropic prompt caching: split system prompt into static/dynamic blocks, add cache markers to the tools array, and memoize static instructions per chat.
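Anthropic's Messages API marks cache boundaries with `cache_control` blocks on system and tool entries. A sketch of the request shape this change implies — the split into a cached static block plus an uncached dynamic block follows the bullet above, while the block contents are illustrative:

```json
{
  "system": [
    {
      "type": "text",
      "text": "<static instructions, memoized per chat>",
      "cache_control": {"type": "ephemeral"}
    },
    {
      "type": "text",
      "text": "<dynamic, per-turn context>"
    }
  ],
  "tools": [
    {
      "name": "shell_command",
      "input_schema": {"type": "object"},
      "cache_control": {"type": "ephemeral"}
    }
  ]
}
```

Placing a marker on the last tool caches the whole tools-array prefix, so repeated turns in the same chat reuse the cached static prompt instead of re-sending it.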

0.123.2

01 Apr 21:47
f1e610c

  • Wait for pending MCP tool list refresh before reading tools after tool execution, fixing race where dynamically loaded tools were not immediately available.

0.123.1

01 Apr 18:58
3521b5a

  • Fix OAuth HTTPS server crash in native image by building SSLContext in-memory instead of relying on ring-jetty's keystore path reflection.