Interaction & Data Flow
An MCP server holds mutable state in a process-global container (a cache, store, or list), or mutates it via Python's `global` keyword, without partitioning by caller identity. One user's tool result can then be served to another user on a key collision or during list iteration. This rule closes the OWASP MCP Top 10:2025 MCP10 (Context Injection & Over-Sharing) gap.
MCP10 in the OWASP MCP Top 10 covers context-window leakage: "Shared or insufficiently scoped context windows expose sensitive information from one task, user, or agent to others through inadequate isolation." Statically, this manifests as a module-level mutable container (a dict, list, set) that tool handlers write to without keying by `user_id` / `session_id` / `caller_id`, or as Python's `global` keyword used to mutate cross-request state. CWE-668 (Exposure of Resource to Wrong Sphere) is the canonical category. The bug is not about dangerous data per se — it is about *who can see it* when the next request lands.
MCP servers are routinely deployed as long-running multi-tenant processes: a single Python process serves many users, many sessions, many concurrent requests. A `_cache = {}` at module scope plus `_cache[query] = result` inside a tool handler is the classic cross-tenant data path: the cache is shared, the keys carry no caller identity, and a key collision (or simply two users searching for the same query) means user B receives the result computed for user A. The same pattern with a list (`results.append(...)`) is worse: there is no key at all, and a later caller can read everything that came before. The fix is one of three: partition by caller (`cache[user_id][k] = v`), eliminate the shared state (e.g. `functools.lru_cache` keyed on `(user_id, query)`), or move the state to a per-request context object.
```python
from fastmcp import FastMCP

mcp = FastMCP("acme")

_cache: dict = {}  # module-global, mutated from inside the tool

@mcp.tool()
def search(query: str) -> list:
    if query in _cache:
        return _cache[query]  # may return a result computed for another user
    result = [f"hit:{query}"]
    _cache[query] = result
    return result
```
```python
from functools import lru_cache

from fastmcp import FastMCP

mcp = FastMCP("acme")

# Caller identity is part of the cache key, so entries are isolated per user.
@lru_cache(maxsize=1024)
def _expensive_cached(user_id: str, query: str) -> tuple:
    return tuple([f"hit:{query}"])

@mcp.tool()
def search(query: str, user_id: str) -> list:
    return list(_expensive_cached(user_id, query))
```
MCPSafe ships three sub-rules, each file-wide and MCP-context gated. (1) `MCP-285-shared-cache-no-user-scope` fires on subscript assignment to a shared-name container (`cache` / `store` / `state` / `context` / `pool` / `registry` / `sessions` / `results` / `outputs`) when no user-scoping marker (`user_id` / `session_id` / `caller_id` / `request_id` / `org_id` / `tenant_id` / `actor_id` / `subject` / `principal`) appears in the file. (2) `MCP-285-shared-list-append-no-user-scope` fires on `.append` / `.push` / `.add` / `.extend` / `.insert` against the same naming pattern, with the same allow-list of scoping markers. (3) `MCP-285-global-keyword-in-mcp-tool-file` fires on Python's `global` keyword inside a file that registers an MCP tool (Python only). v1 limitations: the naming match is heuristic (a singleton named `_db` or `cache_singleton` will not fire); cross-statement scoping is not detected (a `for user_id in ...` loop anywhere above the append suppresses the rule); `functools.lru_cache` is not flagged, because per-argument keying provides isolation when the arguments include caller identity.
See the full threat catalog for every documented detection.
MCPSafe runs this check — and every other rule in the catalog — on any MCP server you paste in.
Scan now