AISystem-2402

This repository provides implementations of both few-shot and zero-shot anomaly detection using CLIP embeddings.

Project Structure

AISystem-2402
├── fewshot-clip/
├── zeroshot-clip/
├── .gitignore
├── LICENSE
└── README.md

Dataset Structure

Training Dataset

./train
├──── {class_name}/
│   ├──── 0000.jpg
│   ├──── ...
│   └──── 0004.jpg
└──── {class_name}/
    ├──── 0000.jpg
    ├──── ...
    └──── 0004.jpg

Test (Validation) Dataset

./test
├──── anomaly/
│   ├──── 0000.jpg
│   ├──── ...
│   └──── 0199.jpg
└──── normal/
    ├──── 0000.jpg
    ├──── ...
    └──── 0199.jpg
    

The dataset for anomaly detection is organized to facilitate both few-shot and zero-shot approaches with CLIP embeddings.
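The directory layout above maps naturally to a per-class index of image paths. The helper below is a hypothetical sketch (not part of this repository) showing how such a layout could be walked with the standard library:

```python
from pathlib import Path

def load_image_paths(root):
    """Collect image paths per class from a ./train-style layout
    ({class_name}/NNNN.jpg). Hypothetical helper for illustration."""
    root_dir = Path(root)
    return {
        class_dir.name: sorted(class_dir.glob("*.jpg"))
        for class_dir in sorted(root_dir.iterdir())
        if class_dir.is_dir()
    }
```

The same function works for the test split, where the "classes" are simply `anomaly/` and `normal/`.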

Dataset Download

The dataset can be downloaded here. (Last updated on November 24, 2024.)

Dataset Organization

For Preliminary Competition (for personal practice)

  • Training Data: 6 classes, 5 normal images per class.
  • Validation Data: 6 classes, 200 anomaly and 200 normal images for each class.

For Main Competition (for project)

  • Training Data: 10 classes (excluding the 6 classes from the preliminary competition), 5 normal images per class.
  • Validation Data: No validation dataset will be given.

Each class within the anomaly dataset contains at least one example of an anomaly, such as dots, cuts, or other class-specific defects, as shown below. This design encourages exploration of anomaly detection under constrained data conditions.

(Example images of class-specific anomalies.)

Getting Started

  1. Clone the Repository

git clone https://github.com/PiLab-CAU/AISystem-2402.git

  2. Set up the Conda Environment

conda create -n cauclip python=3.xx

The recommended Python version is 3.8 or above 😉

conda activate cauclip

  3. Navigate to the Cloned Folder

cd AISystem-2402

  4. Install Dependencies

pip install -r requirements.txt

Once the dependencies are installed, we are ready to detect anomalies! 😀

Run Sample Code

Fewshot-CLIP

To run the sample code for Few-Shot CLIP Embedding, execute:

python fewshot-clip/main.py
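Conceptually, few-shot detection scores a test image by its similarity to the handful of normal reference embeddings available per class. The sketch below is a simplified illustration of that idea, not this repository's actual implementation; it uses plain Python lists as placeholder embeddings (in practice these would come from CLIP's image encoder), and `anomaly_score` and its threshold are hypothetical names:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors given as lists."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def anomaly_score(test_emb, normal_embs):
    """Few-shot idea (simplified): the less similar a test embedding
    is to its best-matching normal reference, the more anomalous."""
    return 1.0 - max(cosine(test_emb, ref) for ref in normal_embs)

def is_anomaly(test_emb, normal_embs, threshold=0.5):
    return anomaly_score(test_emb, normal_embs) > threshold
```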

Zeroshot-CLIP

To run the sample code for Zero-Shot CLIP Embedding, execute:

python zeroshot-clip/main.py
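Zero-shot detection, by contrast, needs no reference images: it compares an image embedding against text-prompt embeddings (e.g. for "normal" vs. "anomalous" descriptions). The sketch below illustrates that comparison only; it is not this repository's implementation, and the placeholder list vectors stand in for CLIP's image and text encoder outputs:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors given as lists."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def zero_shot_predict(image_emb, normal_text_emb, anomaly_text_emb):
    """Classify by comparing the image embedding to two text-prompt
    embeddings; a softmax over the similarities gives a probability."""
    s_norm = cosine(image_emb, normal_text_emb)
    s_anom = cosine(image_emb, anomaly_text_emb)
    e_n, e_a = math.exp(s_norm), math.exp(s_anom)
    p_anomaly = e_a / (e_n + e_a)
    return ("anomaly" if p_anomaly > 0.5 else "normal"), p_anomaly
```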

Performance Summary

The performance of the sample code is shown in the tables below.

Few-Shot CLIP

| Metric | Value |
| --- | --- |
| Train Loss (Epoch 1) | 0.7279 |
| Train Accuracy (Epoch 1) | 52.86% |
| Test Accuracy (Epoch 1) | 63.08% |
| Best Model Path | saved_models/best_model.pth (Accuracy: 63.08%) |
| Overall Test Accuracy | 32.31% |
| Total Time Taken | 18.42 seconds |
| Average Time per Image | 0.2834 seconds |

Inference Details

| Class | Correct | Total | Accuracy | Normal Similarity | Anomaly Similarity |
| --- | --- | --- | --- | --- | --- |
| Normal | 21 | 21 | 100.00% | - | - |
| Anomaly | 0 | 44 | 0.00% | - | - |

Zero-Shot CLIP

| Metric | Value |
| --- | --- |
| Total Images | 65 |
| Correct Predictions | 35 |
| Overall Accuracy | 53.85% |

Class-Specific Performance

| Class | Total | Correct | Incorrect | Accuracy | Avg Anomaly Score | Avg Normal Similarity | Avg Anomaly Similarity |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Normal | 21 | 21 | 0 | 100.00% | 0.288 | 0.921 | 0.634 |
| Anomaly | 44 | 14 | 30 | 31.82% | 0.234 | 0.862 | 0.627 |

Detailed Metrics

| Metric | Value |
| --- | --- |
| Precision | 100.00% |
| Recall | 31.82% |
| F1 Score | 48.28% |
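These detailed metrics follow directly from the class-specific counts above (all 21 normals predicted correctly, 14 of 44 anomalies caught), as this quick check shows:

```python
# Counts taken from the zero-shot class-specific performance table.
tp = 14   # anomalies correctly flagged
fn = 30   # anomalies missed
fp = 0    # normals wrongly flagged (all 21 were predicted correctly)
tn = 21   # normals correctly predicted

precision = tp / (tp + fp)                          # 1.0      -> 100.00%
recall = tp / (tp + fn)                             # ~0.3182  -> 31.82%
f1 = 2 * precision * recall / (precision + recall)  # ~0.4828  -> 48.28%
```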

Metrics are saved to ./results/metrics_{datetime}_{time}.json; all results can be found in the ./results directory.

Further Information

For additional details on each module, check out the module-specific README files.

License


This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For questions or further information, please contact leessoit@gmail.com or use the issue tab to report any problems or suggestions.
