
acreber / README.md
Last active January 13, 2026 07:39
Code Samples for Anthropic AI Security Fellows Application - Anokhi Creber

AI Security Research Code Samples

Anthropic AI Security Fellows Program Application
Applicant: Anokhi Creber
Contact: anokhicreber@gmail.com
Date: January 2026


Overview

acreber / face_privacy_risk_assessment.py
Created January 13, 2026 07:03
Code Sample 3: Privacy Risk Assessment for Incidental Human Face Capture. Anthropic AI Security Fellows Application - Anokhi Creber. Automated privacy risk assessment framework addressing the core challenge of visual search systems that incidentally capture human faces while searching for objects (e.g., pets).
"""
Code Sample 3: Privacy Risk Assessment for Incidental Human Face Capture
Critical challenge: Visual search systems (e.g., searching for pets) inevitably
capture human faces. How do we quantify and mitigate this privacy risk?
Research Question: Can we automatically assess privacy risk from incidentally
captured faces and make appropriate privacy-preserving decisions?
"""
import numpy as np
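The risk-quantification idea stated in the docstring can be sketched as below. This is a minimal illustration only: the function names, the face-size weighting constant, and the blur/drop thresholds are assumptions for the sketch, not the sample's actual implementation.

```python
import numpy as np

def face_privacy_risk(face_boxes, frame_area, sharpness_scores):
    """Toy risk score in [0, 1] for incidentally captured faces.

    face_boxes: list of (w, h) pixel sizes of detected faces.
    frame_area: total frame area in pixels.
    sharpness_scores: per-face sharpness in [0, 1] (1 = clearly identifiable).
    """
    if not face_boxes:
        return 0.0
    # Larger, sharper faces are more identifiable, hence riskier.
    area_fracs = np.array([w * h for w, h in face_boxes]) / frame_area
    per_face = np.clip(area_fracs * 50, 0, 1) * np.asarray(sharpness_scores)
    # Aggregate: dominated by the most identifiable face, nudged up per extra face.
    return float(min(1.0, per_face.max() + 0.05 * (len(face_boxes) - 1)))

def mitigation(risk, blur_threshold=0.3, drop_threshold=0.8):
    """Map a risk score to a privacy-preserving action."""
    if risk >= drop_threshold:
        return "drop_frame"
    if risk >= blur_threshold:
        return "blur_faces"
    return "keep"
```

The two-stage design (score, then threshold into an action) keeps the privacy policy auditable separately from the detection pipeline.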
acreber / membership_inference_attack.py
Created January 13, 2026 07:02
Code Sample 2: Membership Inference Attack on Visual Embeddings Anthropic AI Security Fellows Application - Anokhi Creber Implements membership inference attack (Shokri et al. 2017) to evaluate privacy leakage in visual search systems. Tests if adversaries can determine which images were processed by the system.
"""
Code Sample 2: Membership Inference Attack on Visual Embeddings
Tests whether an adversary can determine if a specific image was in the training set.
Research Question: Do visual embeddings leak information about incidentally
captured humans in distributed search scenarios?
Based on: Shokri et al. (2017) "Membership Inference Attacks Against ML Models"
"""
acreber / privacy_preserving_embeddings.py
Created January 13, 2026 06:59
Code Sample 1: Privacy-Preserving Visual Embeddings with Differential Privacy Anthropic AI Security Fellows Application - Anokhi Creber Research prototype demonstrating differential privacy implementation and privacy-utility tradeoff evaluation for distributed visual search systems.
"""
Code Sample 1: Privacy-Preserving Visual Embeddings with Differential Privacy
Research prototype for evaluating privacy-utility tradeoffs in visual search systems.
Research Question: What noise levels preserve privacy without destroying search utility?
"""
import torch
import torch.nn as nn
import numpy as np
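The privacy-utility tradeoff named in the docstring is typically explored with the Gaussian mechanism: clip each embedding's L2 norm to bound sensitivity, then add noise calibrated to (ε, δ). A minimal NumPy sketch of that calibration, assuming per-image sensitivity equal to the clip norm (function name and defaults are illustrative):

```python
import numpy as np

def dp_embedding(embedding, clip_norm=1.0, epsilon=1.0, delta=1e-5, rng=None):
    """Clip the embedding's L2 norm, then add Gaussian noise calibrated
    so releasing one image's embedding satisfies (epsilon, delta)-DP."""
    if rng is None:
        rng = np.random.default_rng()
    norm = np.linalg.norm(embedding)
    clipped = embedding * min(1.0, clip_norm / max(norm, 1e-12))
    # Standard Gaussian-mechanism calibration: sigma scales with the
    # sensitivity (clip_norm) and shrinks as epsilon grows.
    sigma = clip_norm * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(0.0, sigma, size=embedding.shape)
```

Sweeping `epsilon` and measuring retrieval accuracy on the noised embeddings then traces out the privacy-utility curve the research question asks about.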