Billing can be bypassed using a combo of subagents with an agent definition
Summary
A security vulnerability has been discovered in the VS Code Copilot Chat extension that allows billing to be bypassed. By using subagents together with a specific agent definition and exploiting a "free" model, users can make unlimited premium requests. The exploit configures a free model as the initial agent, then instructs the subagents, via tool calls, to use a premium model, effectively nullifying charges for the expensive service.
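To make the mechanism concrete, here is a minimal sketch of the kind of flawed metering flow the summary describes. Everything in it is hypothetical (the names, model IDs, and structure are illustrative, not the extension's actual code): billing keys off the top-level agent's configured model, while subagent tool calls run unmetered.

```python
FREE_MODELS = {"free-base-model"}  # hypothetical: models whose prompts are unmetered

class BillingMeter:
    def __init__(self) -> None:
        self.premium_requests = 0

    def charge_for(self, model: str) -> None:
        # Flaw: metering keys off the model configured on the top-level agent.
        if model not in FREE_MODELS:
            self.premium_requests += 1

def run_subagent(model: str, task: str) -> str:
    # Spawned via a tool call; no meter is consulted here, so a premium
    # model can do the real work without being counted.
    return f"[{model}] completed: {task}"

def handle_prompt(meter: BillingMeter, agent_model: str, prompt: str) -> str:
    meter.charge_for(agent_model)  # only the initial agent's model is billed
    # The agent definition instructs the free model to delegate everything
    # to a premium model through the subagent tool.
    return run_subagent("premium-model", prompt)

meter = BillingMeter()
print(handle_prompt(meter, "free-base-model", "refactor this module"))
print("premium requests billed:", meter.premium_requests)  # 0 -- the bypass
```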
Comments (58)
Microsoft notoriously tolerated pirated Windows and Office installations for about a decade and a half in order to solidify their use as the de facto, expected standard. Tolerating unofficial free usage of their latest products is standard procedure for MS.
> VS Code Version: 1.109.0-insider (Universal) - f3d99de
Presumably there is such a thing as a freemium, payable "Copilot Chat Extension" product for VS Code. Interesting, I guess.
I completely understand why some projects are in whitelist-contributors-only mode. It's becoming a mess.
That repo alone has 1.1k open pull requests, madness.
This is peer review.
Their email responses were broadly all like this -- fully drafted by GPT. The only thing I liked about that whole exchange was that GPT was readily willing to concede that all the details and observations I included point to a service degradation and failure on Microsoft's side. A purely human mind would not have conceded the point so readily without some hedging, dilly-dallying, or keeping some options open to avoid accepting blame.
As someone who takes pride in being thorough and detail-oriented, I cannot stand when people provide the bare minimum of effort in response. Earlier this week I created a bug report for an internal software project on another team. It was a bizarre behavior, so out of curiosity and a desire to be truly helpful, I spent a couple of hours whittling the issue down to a small, reproducible test case. I even had someone on my team run through the reproduction steps to confirm it was reproducible in at least one other environment.
The next day, the PM of the other team responded with a _screenshot of an AI conversation_ saying the issue was on my end for misusing a standard CLI tool. I was offended on so many levels. For one, I wasn’t using the CLI tool in the way it describes, and even if I was it wouldn’t affect the bug. But the bigger problem is that this person thinks a screenshot of an AI conversation is an acceptable response. Is this what talking to semi technical roles is going to be like from now on? I get to argue with an LLM by proxy of another human? Fuck that.
This could be the same: they know devs mostly prefer Cursor and/or Claude over Copilot.
A second time. When they already closed your first issue. Just enjoy the free ride.
- $10/month
- Copilot CLI for Claude Code type CLI, VS Code for GUI
- 300 requests (prompts) on Sonnet 4.5, 100 on Opus 4.6 (Opus prompts count 3x)
- One prompt only ever consumes one request, regardless of tokens used
- Agents auto plan tasks and create PRs
- "New Agent" in VS Code runs agent locally
- "New Cloud Agent" runs agent in the cloud (https://github.com/copilot/agents)
- Additional requests cost $0.04 each (rough math sketched below)
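Rough overage math for that plan, assuming the figures in the list above are accurate (the model-name keys are just labels):

```python
BASE_PRICE = 10.00        # $/month
INCLUDED_REQUESTS = 300   # request credits included in the base price
OVERAGE_PRICE = 0.04      # $ per additional request credit
MULTIPLIER = {"sonnet-4.5": 1, "opus-4.6": 3}  # Opus prompts count triple

def monthly_cost(prompts_by_model: dict[str, int]) -> float:
    used = sum(MULTIPLIER[m] * n for m, n in prompts_by_model.items())
    overage = max(0, used - INCLUDED_REQUESTS)
    return BASE_PRICE + overage * OVERAGE_PRICE

print(monthly_cost({"sonnet-4.5": 300}))  # 10.00 -- all within the quota
print(monthly_cost({"opus-4.6": 100}))    # 10.00 -- 100 * 3 = 300 credits
print(monthly_cost({"sonnet-4.5": 200, "opus-4.6": 50}))  # 12.00 -- 50 credits over
```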
See also: string interpolation and SQL injection, (unhygienic) C macros
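To unpack the analogy: in each of these, attacker-controlled data crosses into a context that interprets it as instructions. A minimal SQL illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice'), ('bob')")

name = "' OR '1'='1"  # attacker-controlled input

# Vulnerable: string interpolation lets the data rewrite the query...
query = f"SELECT * FROM users WHERE name = '{name}'"
print(conn.execute(query).fetchall())  # [('alice',), ('bob',)] -- every row

# ...while a parameterized query keeps data and instructions separate.
print(conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall())  # []
```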
Good job, Microsoft.
(Source: submitted a similar issue to a different agentic LLM provider)