Using image data, predict the gender and age range of an individual in Python. Test the data science model using your own image.
Predicting apparent age and gender from a picture is an interesting problem from a technical point of view, and it is also useful in practice, for example when trying to better understand consumer segments or a user base. The inferred age or gender of a user can then be used to build personalized products and experiences.
The predicted gender is either ‘Male’ or ‘Female’, and the predicted age falls into one of eight ranges: (0–2), (4–6), (8–12), (15–20), (25–32), (38–43), (48–53), (60–100), corresponding to the 8 nodes in the final softmax layer. Accurately guessing an exact age from a single image is very difficult because of factors like makeup, lighting, obstructions, and facial expressions. For that reason, I framed this as a classification problem instead of a regression problem.
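The two label sets above can be written directly as Python lists, and a small helper can map a softmax output (here just a plain list of scores) to its label via argmax. This is a minimal sketch; the names `AGE_BUCKETS`, `GENDER_LIST`, and `decode_prediction` are my own, not from the original code:

```python
# Label sets matching the 8-node age softmax and the 2-node gender softmax
AGE_BUCKETS = ['(0-2)', '(4-6)', '(8-12)', '(15-20)',
               '(25-32)', '(38-43)', '(48-53)', '(60-100)']
GENDER_LIST = ['Male', 'Female']

def decode_prediction(scores, labels):
    """Return the label whose softmax score is highest (argmax)."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return labels[best]

# Example: a made-up softmax output that peaks on the fifth age bucket
print(decode_prediction([0.01, 0.02, 0.05, 0.1, 0.6, 0.1, 0.07, 0.05],
                        AGE_BUCKETS))  # (25-32)
```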
Python libraries:
- opencv
- argparse
Utilities:
Implementing the whole pipeline requires a few pretrained model files, each purpose-built for a specific task such as face detection or age and gender prediction. You can download the files listed below from this link:
opencv_face_detector.pbtxt
opencv_face_detector_uint8.pb
age_deploy.prototxt
age_net.caffemodel
gender_deploy.prototxt
gender_net.caffemodel
For face detection, we have a .pb file: this is a protobuf (protocol buffer) file, and it holds both the graph definition and the trained weights of the model, so we can use it to run the trained model. While a .pb file holds the protobuf in binary format, the .pbtxt file holds it in text format. These are TensorFlow files. For age and gender, the .prototxt files describe the network configuration and the .caffemodel files hold the trained parameters of the layers.
To run the project, use the following command:
python detect.py --image girl1.jpg
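The `--image` flag suggests that detect.py parses its command line with argparse. A plausible sketch of that parsing (the help text and default behavior are assumptions, not taken from the original script):

```python
import argparse

# Command-line interface used as: python detect.py --image girl1.jpg
parser = argparse.ArgumentParser(description='Detect age and gender in an image.')
parser.add_argument('--image',
                    help='Path to the input image; omit to use the webcam.')

# Example invocation with an explicit argument list
args = parser.parse_args(['--image', 'girl1.jpg'])
print(args.image)  # girl1.jpg
```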
Outputs:
That’s it for gender and age detection. For the full implementation, visit the link above to get an idea of the steps involved.