Fortune On Your Hand: View-Invariant Machine Palmistry

Summary

Our palmistry principal-line detection software consists of the four steps below. Our main challenge was to read the principal lines on a palm regardless of view direction and illumination:

  1. Warping a tilted palm image
  2. Detecting principal lines on a palm
  3. Classifying the lines
  4. Measuring the length of each line
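As a rough illustration of step 1, the sketch below estimates a homography from four point correspondences via the direct linear transform. This is NumPy-only illustrative code, not the project's implementation; in the actual pipeline the source points would come from MediaPipe palm landmarks, and the warp itself would be applied with OpenCV.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping src points onto dst points
    (four point pairs, direct linear transform)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    # The homography is the null vector of A, i.e. the right singular
    # vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply a homography to a single (x, y) point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

Given the estimated H, something like cv2.warpPerspective(image, H, (width, height)) would then produce the rectified palm image.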

(Figure: model architecture)

For palm image rectification, we used MediaPipe to extract interest points and warped the image based on those points. For principal line detection, we built a deep learning model and trained it on a palm image dataset. For line classification, we used K-means clustering to assign each pixel to a specific line. For length measurement, we set a threshold for each principal line using the landmarks obtained from MediaPipe.
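For the line-classification step, a minimal, self-contained K-means over 2-D pixel coordinates might look like the sketch below. This is NumPy-only illustrative code with a simplified deterministic initialization, not the repository's actual clustering code; in the pipeline, the input points would be the pixel coordinates of the detected principal-line pixels.

```python
import numpy as np

def kmeans(points, k, iters=20):
    """Minimal K-means: group 2-D points (e.g. detected line pixels)
    into k clusters, one per candidate principal line."""
    points = np.asarray(points, dtype=float)
    # Simplified deterministic init: pick k points evenly spaced in the array.
    centers = points[np.linspace(0, len(points) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Distance from every point to every center, then nearest-center labels.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers
```

Each label then identifies which line a pixel belongs to, and per-cluster pixel sets can be measured against the landmark-derived thresholds.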

Environment

The code is written for Python 3.7.6. The following packages are required to run it:

  • torch
  • torchvision
  • scikit-image
  • opencv-python
  • pillow-heif
  • mediapipe

To install the requirements, run pip install -r ./code/requirements.txt.

Run

  1. Before running the code, place an input palm image (.heic or .jpg) in the ./code/inputs directory. We provide four sample inputs.
  2. Run read_palm.py with the command below. The result files will be saved in the ./code/results directory.
> python ./code/read_palm.py --input [filename].[jpg, heic]

Results

(Result: standard palm image)

(Result: tilted palm image)
