ml / hard
ReLuess Your Inhibitions
kalmarcf
Task: a black-box oracle exposes a 1-hidden-layer ReLU neural network with a 128×192 first-layer weight matrix; the provided viewer.py converts the weights to an image. Solution: model stealing via ReLU kink analysis. Walk the input along the all-ones direction to locate each neuron's activation boundary (the kink where its pre-activation crosses zero), compute the 192-dimensional gradient on either side of each kink via parallel finite differences (~48K queries total), and recover the rank-1 rows W2[j]*W1[j,:] from the gradient differences. Binarizing the recovered rows and stacking them reconstructs the pixel-font image, from which the flag can be read.
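The core of the attack can be sketched on a toy network standing in for the oracle. All dimensions and weight values below are made up for the demo (the challenge's network was 128 hidden neurons × 192 inputs), and the kink locations are computed directly rather than found by probing, to keep the sketch short; in the real attack they would be located by searching the all-ones direction for slope changes in the oracle's output.

```python
import numpy as np

# Toy stand-in for the challenge oracle: a 1-hidden-layer ReLU network.
# All weight values here are illustrative, not the challenge's.
W1 = np.array([[ 1.0,  2.0, -1.0,  0.5],
               [-0.5,  1.0,  1.0,  1.0],
               [ 2.0, -1.0,  0.5,  1.0]])
b1 = np.array([-1.0, -3.0, -6.0])
W2 = np.array([1.5, -2.0, 0.7])
b2 = 0.3

def oracle(x):
    """Black-box query: the network's scalar output f(x)."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def grad(x, eps=1e-4):
    """Finite-difference gradient (D+1 oracle queries per point; the
    challenge batched these in parallel, ~48K queries in total).
    Exact up to float rounding, since the network is piecewise linear."""
    f0 = oracle(x)
    return np.array([(oracle(x + eps * e) - f0) / eps
                     for e in np.eye(len(x))])

# Along x(t) = t * ones, neuron j kinks where W1[j] @ x + b1[j] == 0,
# i.e. at t = -b1[j] / sum(W1[j]).
ones = np.ones(4)
t_kink = -b1[1] / (W1[1] @ ones)      # neuron 1's activation boundary

# Gradients just below and just above the kink differ by W2[1] * W1[1, :]
# (up to sign, depending on which side the neuron is active on).
g_lo = grad((t_kink - 0.1) * ones)
g_hi = grad((t_kink + 0.1) * ones)
row = g_hi - g_lo                     # ~ W2[1] * W1[1, :]
print(row)                            # close to [ 1. -2. -2. -2.]
```

Because the network is piecewise linear, the finite differences are exact as long as all probe points stay inside the same linear region, which is why small step sizes around well-separated kinks suffice.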
$ ls tags/ techniques/
model_extraction_attack  relu_kink_analysis  finite_difference_gradient  parallel_oracle_queries  binary_image_reconstruction  pixel_font_ocr