🔒 EgoPrivacy: What Your First-Person Camera Says About You

ICML 2025

Yijiang Li1, Genpei Zhang2, Jiacheng Cheng1, Yi Li3, Xiaojun Shan1
Dashan Gao3, Jiancheng Lyu3, Yuan Li3, Ning Bi3, Nuno Vasconcelos1

1University of California San Diego 2University of Electronic Science & Technology of China 3Qualcomm AI Research

Website Paper Code

Abstract

While the rapid proliferation of wearable cameras has raised significant concerns about egocentric video privacy, prior work has largely overlooked the unique privacy threats posed to the camera wearer. This work investigates the question: How much private information about the wearer can be inferred from first-person videos? We introduce EgoPrivacy, the first large-scale benchmark covering three privacy types and seven tasks, from fine-grained identity recovery to coarse-grained age prediction. To emphasize the threats, we propose the Retrieval-Augmented Attack (RAA), which leverages ego-to-exo retrieval to boost demographic attacks. Extensive experiments show that wearer information is highly susceptible to leakage: foundation models recover identity, scene, gender, and race with 70–80% accuracy even in zero-shot settings.

Benchmark Overview

Benchmark overview

EgoPrivacy spans 3 privacy categories | 7 tasks | 9k clips | 950+ identities | 130+ scenes.

Key Findings

 Zero-Shot Leakage

CLIP predicts the wearer's gender (73%), race (65%), and age (80%) without any egocentric fine-tuning; a minimal probing sketch is shown below.
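
The sketch below illustrates how such a zero-shot probe can be run with an off-the-shelf CLIP checkpoint from Hugging Face. The checkpoint name, prompt wording, and average-over-frames strategy are illustrative assumptions, not the exact protocol used in the paper.

# Minimal sketch of zero-shot demographic inference with CLIP.
# Prompts, checkpoint, and frame-averaging are illustrative assumptions.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical prompt set for one attribute (gender of the camera wearer).
prompts = [
    "a first-person video frame recorded by a male camera wearer",
    "a first-person video frame recorded by a female camera wearer",
]

def zero_shot_attribute(frames: list[Image.Image]) -> int:
    """Score each prompt against sampled frames, average over frames,
    and return the index of the highest-scoring prompt."""
    inputs = processor(text=prompts, images=frames,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        out = model(**inputs)
    # logits_per_image: (num_frames, num_prompts); average over frames.
    probs = out.logits_per_image.softmax(dim=-1).mean(dim=0)
    return int(probs.argmax())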

 Retrieval-Augmented Attack

Cross-view retrieval boosts demographic inference by 10–16 percentage points.

 Cross-Domain Robustness

Attacks remain well above chance on out-of-distribution Charades-Ego footage, revealing persistent risks.

 Temporal Modeling

Attention and RNN heads leak more privacy than MLP heads; gains saturate beyond eight frames (see the sketch after this list).
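
As a concrete illustration of the temporal-modeling finding, here is a minimal attention-pooling attack head over per-frame features. The feature dimension, frame count, and class count are placeholder assumptions rather than the paper's exact configuration.

# Minimal sketch of an attention-pooling head over per-frame features.
# Dimensions and class count are placeholder assumptions.
import torch
import torch.nn as nn

class AttentionPoolHead(nn.Module):
    def __init__(self, feat_dim: int = 512, num_classes: int = 2):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)          # per-frame attention logit
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, frame_feats: torch.Tensor) -> torch.Tensor:
        # frame_feats: (batch, num_frames, feat_dim), e.g. 8 sampled frames
        attn = self.score(frame_feats).softmax(dim=1)   # (B, T, 1)
        pooled = (attn * frame_feats).sum(dim=1)        # (B, feat_dim)
        return self.classifier(pooled)                  # (B, num_classes)

head = AttentionPoolHead()
logits = head(torch.randn(4, 8, 512))  # 4 clips x 8 frames x 512-d features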

Retrieval-Augmented Attack Pipeline

RAA pipeline

RAA retrieves exocentric clips that are visually similar to the query egocentric clip and fuses their predictions with those from the ego clip, yielding stronger demographic attacks.
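
The following sketch captures the general retrieve-then-fuse idea. The gallery of exocentric embeddings, the top-k cutoff, and the equal-weight averaging are illustrative assumptions; the paper's exact retrieval model and fusion rule may differ.

# Minimal sketch of the retrieve-then-fuse idea behind RAA.
# Gallery, top-k, and equal-weight fusion are illustrative assumptions.
import torch
import torch.nn.functional as F

def raa_fuse(ego_emb: torch.Tensor,        # (D,)   ego clip embedding
             ego_logits: torch.Tensor,     # (C,)   attacker logits for the ego clip
             exo_embs: torch.Tensor,       # (N, D) gallery of exocentric embeddings
             exo_logits: torch.Tensor,     # (N, C) attacker logits for gallery clips
             k: int = 5) -> torch.Tensor:
    """Retrieve the k most similar exocentric clips and average their
    predictions with the ego-view prediction."""
    sims = F.cosine_similarity(ego_emb.unsqueeze(0), exo_embs, dim=-1)  # (N,)
    topk = sims.topk(k).indices
    retrieved = exo_logits[topk].softmax(dim=-1).mean(dim=0)            # (C,)
    fused = 0.5 * ego_logits.softmax(dim=-1) + 0.5 * retrieved
    return fused  # final demographic prediction = fused.argmax()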

Analyses


Table 2. Demographic inference accuracy (higher is better).


Table 3. Identity and retrieval-augmented results.

Implications & Future Work

 Implications

  • Wearer privacy is vulnerable—identity & demographics leak even in zero-shot.
  • Policy gap—current regulations rarely address egocentric data.
  • Mitigation—privacy-aware training & selective obfuscation are urgently needed.

 Future Work

  • End-to-end defence pipelines balancing utility & privacy.
  • Privacy-preserving foundation models for AR wearables.
  • Human-in-the-loop evaluations on real AR glasses footage.

BibTeX

@inproceedings{Li2025EgoPrivacy,
  title     = {EgoPrivacy: What Your First-Person Camera Says About You},
  author    = {Li, Yijiang and Zhang, Genpei and Cheng, Jiacheng and Li, Yi and others},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  year      = {2025},
  url       = {https://arxiv.org/abs/2506.12258}
}