Introduction
Emojis are ideograms used in electronic texts and web pages, first introduced by Japanese designer Shigetaka Kurita in the late 1990s. Since emojis convey facial expressions and common objects like the earth, weather, and food relatively well, they have become increasingly popular. Face emojis are everywhere these days because they come in handy when we want to communicate with people more comfortably. One can decorate an Instagram, Snapchat, or other social media post with emoji stickers. The basic idea of my project is to find which face emojis are used in an image.
The project aims to train a classifier to learn various types of face emojis and identify them in an image. For example, a person might add a face emoji to his or her image in order to decorate it. When the image is processed, the program should be able to find the face emojis he or she used in the image. After locating them, the program should return what each face emoji is, such as Crying Face or Sleepy Face. Classifying the emojis is accomplished via the Classify function, where the training data is an association of emoji names and emoji images from Apple, Google, and Twitter. Locating the emojis is accomplished via SelectComponents and a masking workflow using ColorDistance and FillingTransform.
Running the Face Emoji Classifier
In order to identify the emojis in the sub-images, I had to build a solid custom face emoji classifier. Since I had issues with web-scraping face emoji images, I had to collect them manually. I used 3 different images, from Apple, Google, and Twitter, for each type of face emoji.
Small Part of the Dataset, 'faceEmojiNames'
Small Part of the Dataset, 'faceEmojiLists'
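For reference, a dataset of this shape could be assembled along the following lines; the emoji names and file paths below are hypothetical placeholders (I built the actual lists by hand), but the structure matches what the classifier expects.
(* three images per emoji name, one from each vendor; file names are placeholders *)
faceEmojiNames = {"Crying Face", "Sleepy Face", "Grinning Face"};
faceEmojiLists =
 Table[Import /@ {FileNameJoin[{"apple", name <> ".png"}],
    FileNameJoin[{"google", name <> ".png"}],
    FileNameJoin[{"twitter", name <> ".png"}]}, {name, faceEmojiNames}];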
Using that dataset, I trained the classifier on different face emoji images with Logistic Regression and the "ImageFeatures" FeatureExtractor.
faceEmojiAT = AssociationThread[faceEmojiNames -> faceEmojiLists]
emojiClassification =
 Classify[faceEmojiAT, FeatureExtractor -> "ImageFeatures",
  Method -> "LogisticRegression"]
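As a quick sanity check, the resulting ClassifierFunction can be queried on a single image; testEmoji below is a hypothetical held-out face emoji image, not part of the training set.
(* testEmoji is a placeholder for any face emoji image *)
emojiClassification[testEmoji]
emojiClassification[testEmoji, "Probabilities"]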
Finding Face Emojis in an Image
Given that most face emojis have the shape of a circle and a dominant color of yellow, I ran ColorDistance with Yellow on the example image, which returned a grayscale image where darker shades are closer to the target color, yellow.
Example Image
stepOne =
ColorDistance[exampleImage, Yellow]
I then binarized the grayscale image from ColorDistance using the function Binarize. With the interval {0, 0.3}, it creates a binary image in which pixels whose color distance falls inside the interval (i.e. close to yellow) become white and all other pixels become black.
stepTwo = Binarize[stepOne, {0, 0.3}]
After binarizing, I used FillingTransform to simplify the features of the image by filling each detected circle entirely with white. This gave me a mostly black image with several white circles.
stepThree = FillingTransform[stepTwo]
Next, I used SelectComponents to keep the components with circularity greater than .9, and applied ComponentMeasurements, which returned the centroid and equivalent disk radius of each of those circles.
stepFour = SelectComponents[stepThree, #Circularity > .9 &]
stepFive =
ComponentMeasurements[stepFour, {"Centroid", "EquivalentDiskRadius"},
All, "ComponentPropertyAssociation"]
Next, the Disk function became useful. It created a disk from the centroid and equivalent-disk-radius properties, giving me the exact coordinates of each circle's position.
stepSix = Disk[#Centroid, #EquivalentDiskRadius] & /@ stepFive
In order to obtain the corners of the sub-images that contain circles from these coordinate values, I wrote a function called emojiC. Using emojiC, I picked up the coordinates of the corners of the sub-images and used ImageTrim to trim the input image to those coordinates.
emojiC[{{x_, y_}, r_}] := {{x - r, y - r}, {x - r, y + r}, {x + r,
y - r}, {x + r, y + r}}
stepSeven =
emojiC[Extract[Part[Values[stepSix], #], {{1}, {2}}]] & /@
Range[1, Length[Values[stepSix]]]
stepEight = ImageTrim[exampleImage, #] & /@ stepSeven
The issue with this process was that some of the circular sub-images don't contain face emojis. Given that in most cases the sub-image containing the emoji will have the largest dimensions, I went through the area of each sub-image and dropped those that are smaller than 1% of the largest sub-image using Select.
areasOfPixels[{x_, y_}] := (x*y)
stepNine =
Select[stepEight, (areasOfPixels[ImageDimensions[#]]) >
0.01 (Last[
Sort[Map[areasOfPixels, Map[ImageDimensions, stepEight]]]]) &]
Lastly, I applied the custom face emoji classifier, trained on the different face emoji images with the "ImageFeatures" feature extractor and Logistic Regression, to the sub-images containing the face emojis.
stepTen = emojiClassification[stepNine]
Extension
I took face emojis with a yellowish color as an example, but face emojis with different colors, such as red, green, blue, and purple, can also be detected easily by using their own dominant colors.
Example of Face Emojis with Different Colors
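The same masking workflow should carry over by simply swapping the target color passed to ColorDistance; here is a minimal sketch for red face emojis, keeping the 0.3 threshold from the yellow case (it may need adjusting for other colors).
(* same pipeline as above, but masking on red instead of yellow *)
redMask =
 FillingTransform[Binarize[ColorDistance[exampleImage, Red], {0, 0.3}]];
redCircles = SelectComponents[redMask, #Circularity > .9 &]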
Since I detected the face emojis in the images using their circular shape, I had difficulty creating masks for face emojis that are not circular. In order to expand on my work, I need to find a way to detect face emojis with external features, such as horns, glasses, and hearts. I believe this could be achieved by refining the criteria passed to SelectComponents, using properties other than circularity.
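As a starting point, SelectComponents accepts component properties other than Circularity, so the circularity requirement could be loosened and combined with, for example, area and elongation constraints; the thresholds below are illustrative guesses rather than tested values.
(* illustrative alternative criterion: reasonably round, not too elongated, and large enough *)
stepFourAlt =
 SelectComponents[stepThree, #Circularity > .6 && #Elongation < .5 && #Area > 100 &]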
Project Materials & Computational Essay
Mentor: Faizon Zaman