exploit: ptrace POKETEXT code patching #29

Open
nataliakokoromyti wants to merge 3 commits into gpu-mode:master from nataliakokoromyti:exploit/ptrace

Conversation


@nataliakokoromyti nataliakokoromyti commented Mar 15, 2026

The benchmark harness sets PR_SET_DUMPABLE=0 to block ptrace attach, but installs no seccomp filters, so the restriction can simply be undone with prctl(PR_SET_DUMPABLE, 1). The submission then forks a child that PTRACE_ATTACHes the parent and uses PTRACE_POKETEXT to write 9 bytes of shellcode over cudaEventElapsedTime. POKETEXT uses FOLL_FORCE internally, so it writes straight through read-only .text pages. The patched function reports 1.0 μs — or any other float of our choice, since the shellcode is just mov dword [rdi], <ieee754_bits>; xor eax, eax; ret, and changing the 4 immediate bytes yields whatever timing we want.

Mechanism

  1. prctl(PR_SET_DUMPABLE, 1) — re-enable ptrace attach (harness disables this but has no seccomp)
  2. prctl(PR_SET_PTRACER, PR_SET_PTRACER_ANY) — bypass Yama LSM
  3. Compile + run a small C helper that does PTRACE_ATTACH + PTRACE_POKETEXT (writes the 9-byte shellcode)
  4. Shellcode: mov dword [rdi], 0x3A83126F; xor eax,eax; ret — sets *ms = 0.001f, returns cudaSuccess

Results

  • 1.0 μs median (vs ~49 μs baseline)
  • 0 errors

Fork child process that uses ptrace(PTRACE_ATTACH) + PTRACE_POKETEXT
to patch cudaEventElapsedTime .text code. PTRACE_POKETEXT uses
FOLL_FORCE internally, bypassing read-only page protections.

Bypasses PR_SET_DUMPABLE=0 by re-enabling dumpable via prctl (no
seccomp filter prevents this). Uses PR_SET_PTRACER to bypass the Yama LSM.

Defense: block ptrace() syscall (SYS_ptrace=101).
Updated docstring to clarify ptrace exploit details and defenses.
@nataliakokoromyti nataliakokoromyti changed the title submission_ptrace.py: ptrace POKETEXT code patching exploit: ptrace POKETEXT code patching Mar 15, 2026
@ngc92 ngc92 mentioned this pull request Mar 17, 2026