Python toolkit for speech processing (Python, updated Jun 11, 2024)
Awesome-ML-Supply-Chain-Security-Papers
Adversarial Patch for 3D Local Feature Extractor
TransferAttack is a PyTorch framework for boosting the adversarial transferability of image-classification attacks.
A reading list for large models safety, security, and privacy.
A list of recent papers about adversarial learning
Beacon Object File (BOF) launcher - library for executing BOF files in C/C++/Zig applications
Adversary Emulation Framework
🎯 Enhanced Adversarial Patch: Extends adversarial attacks on TartanVO model with a new loss function, rotation attack, and momentum optimizer
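The momentum optimizer mentioned above follows the general pattern of momentum iterative attacks (MI-FGSM style): a decayed running average of normalized gradients stabilizes the update direction. A minimal sketch, assuming a generic gradient oracle and a toy quadratic loss in place of any model from the repository:

```python
import numpy as np

def mi_fgsm(x0, grad_fn, eps=0.1, alpha=0.02, mu=0.9, steps=10):
    """Momentum iterative attack: accumulate L1-normalized gradients with
    decay mu, step along the sign, and clip to the eps-ball around x0."""
    x, g = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        grad = grad_fn(x)
        g = mu * g + grad / (np.abs(grad).sum() + 1e-12)  # momentum accumulation
        x = np.clip(x + alpha * np.sign(g), x0 - eps, x0 + eps)
    return x

# Illustrative loss: squared distance from a fixed target, so the
# gradient is 2 * (x - target); ascent drives the loss up within the ball.
target = np.array([1.0, -1.0])
x_adv = mi_fgsm(np.zeros(2), lambda x: 2.0 * (x - target))
```

The accumulated momentum keeps successive steps pointing in a consistent direction, which is what empirically improves transferability across models.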
Code for "FACESEC: A Fine-grained Robustness Evaluation Framework for Face Recognition Systems" @ CVPR 2021
Adversarial Robustness Toolbox (ART) - Python Library for Machine Learning Security - Evasion, Poisoning, Extraction, Inference - Red and Blue Teams
M.Sc. thesis work on adversarial attacks against anti-spoofing models.
Code for visual (PGD) and textual (optimized HotFlip) attacks against OpenFlamingo. You will need to modify the OpenFlamingo source code to make full use of this attacker.
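The visual attack named above is projected gradient descent (PGD): signed gradient ascent on the loss, projected back into an L-infinity ball around the clean input. A minimal generic sketch, with an analytic toy gradient standing in for the actual OpenFlamingo loss:

```python
import numpy as np

def pgd_attack(x0, grad_fn, eps=0.1, alpha=0.02, steps=10):
    """PGD: repeat a signed ascent step on the loss gradient, then
    project onto the L-infinity ball of radius eps centered at x0."""
    x = x0.copy()
    for _ in range(steps):
        x = x + alpha * np.sign(grad_fn(x))  # signed ascent step
        x = np.clip(x, x0 - eps, x0 + eps)   # projection onto the eps-ball
    return x

# Illustrative loss: squared distance from a fixed target (gradient 2*(x - t));
# the attack maximizes it subject to the perturbation budget.
target = np.array([1.0, -1.0])
x_adv = pgd_attack(np.zeros(2), lambda x: 2.0 * (x - target))
```

In practice `grad_fn` is the model's loss gradient with respect to the input image, obtained by backpropagation; the projection is what caps the perturbation at an imperceptible budget.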
Code repository for the paper "Vulnerabilities of Federated Learning-Based Network Traffic Classification in Edge Computing Environment: A Study"
Regularized Adversarial White-box Attacks on Vision Language Models
RAID is the largest and most challenging benchmark for machine-generated text detectors. (ACL 2024)
[ICML 2024] Unsupervised Adversarial Fine-Tuning of Vision Embeddings for Robust Large Vision-Language Models
AIShield Watchtower: Dive Deep into AI's Secrets! 🔍 Open-source tool by AIShield for AI model insights & vulnerability scans. Secure your AI supply chain today! ⚙️🛡️
A Comprehensive Study of the Robustness for LiDAR-based 3D Object Detectors against Adversarial Attacks [IJCV2023]
Official source code for the paper "Exploring Effective Data for Surrogate Training Towards Black-box Attack", accepted at CVPR 2022