
Sandboxing & Isolation

1 resource

Defenses & Mitigations

Tool sandboxing, execution containment, and process isolation
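The common thread across resources in this category is running model-requested tools inside a contained process rather than in the agent's own. Below is a minimal sketch of that pattern, assuming a hypothetical run_tool_sandboxed helper; containment here is only an empty environment, a fixed working directory, and a hard timeout, whereas real sandboxes layer namespaces, seccomp filters, or container/VM boundaries on top.

```python
import subprocess

def run_tool_sandboxed(cmd: list[str], workdir: str, timeout_s: int = 5) -> str:
    """Run an LLM-requested tool command in a contained subprocess (illustrative sketch)."""
    result = subprocess.run(
        cmd,
        cwd=workdir,        # confine filesystem side effects to one directory
        env={},             # avoid leaking API keys or other host environment variables
        capture_output=True,
        text=True,
        timeout=timeout_s,  # bound the runtime of attacker-influenced commands
    )
    if result.returncode != 0:
        raise RuntimeError(f"tool failed: {result.stderr[:200]}")
    return result.stdout
```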

paper · reviewed · open access · 2024

Securing LLM Systems Against Prompt Injection

Yupei Liu, Yuqi Jia, Runpeng Geng, and 2 more — arXiv preprint

Proposes defense mechanisms against prompt injection in LLM systems, including isolation-based approaches, input/output filtering, and detection methods.
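As a rough illustration of the isolation and input-filtering ideas mentioned in the description (not the paper's actual mechanisms; the patterns, delimiters, and function names below are assumptions), a defense might scan untrusted text for instruction-like content and wrap whatever passes in explicit data delimiters before it reaches the prompt.

```python
import re

# Illustrative patterns that often signal injected instructions in untrusted text.
SUSPICIOUS = [
    r"ignore (all|any|previous) instructions",
    r"you are now",
    r"reveal (the|your) system prompt",
]

def filter_untrusted(text: str) -> str:
    """Reject untrusted input that looks like an injected instruction (input filtering)."""
    for pat in SUSPICIOUS:
        if re.search(pat, text, flags=re.IGNORECASE):
            raise ValueError(f"possible prompt injection: matched {pat!r}")
    return text

def isolate(text: str) -> str:
    """Wrap untrusted data in delimiters so the model is told to treat it as data, not instructions (isolation)."""
    return f"<untrusted_data>\n{text}\n</untrusted_data>"
```

A caller would typically apply filter_untrusted and isolate to retrieved documents or tool output before concatenating them into the model's prompt.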