Digital Foundation Recommender


Grant: AI4ALL

Dahana Moz Ruiz

CoPIs:
Annaliese Watson

College:
The Dorothy and George Hennings College of Science, Mathematics, and Technology

Major:
Computer Science

Faculty Research Advisor(s):
Yulia Kumar, J. Jenny Li, Haley Massa

Abstract:
This project addresses the widespread challenge of finding makeup shades that match various skin tones, especially for individuals with darker complexions. The research aims to provide personalized recommendations, reducing guesswork and saving time for users of all skin types. By offering accessible and inclusive solutions, this project contributes to broader efforts in the beauty industry to promote representation, accessibility, and empowerment for all individuals.

The research used a Kaggle dataset of foundation products from 38 brands, comprising 625 distinct foundation shades and spanning the USA, Japan, India, and Nigeria. The project applied the k-means clustering algorithm to the makeup shade data stored in a CSV file: by clustering RGB values, it groups the shades into distinct clusters. After the user selects a skin region and its color is detected via webcam using OpenCV, the algorithm recommends the closest makeup shade from the dataset, based on the cluster center most similar to the detected color. The process concludes by displaying the recommended shade's brand, product name, and hex value. Because k-means is an unsupervised learning algorithm, it partitions the makeup shades into clusters without labeled data, which supports recommendation accuracy. Inputs are the makeup shade data and a live video feed; outputs are personalized makeup shade recommendations and their associated details. The system makes makeup selection more convenient and accurate, streamlining the user experience with an efficient, automated process.
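The following is a minimal sketch of the pipeline described above: loading shade data from a CSV file, clustering RGB values with k-means, sampling a skin region from a webcam frame with OpenCV, and returning the shade nearest to the best-matching cluster center. The CSV filename, the column names (brand, product, hex), the number of clusters, and the fixed central sampling patch are illustrative assumptions, not details taken from the original study.

```python
# Sketch of the shade-recommendation pipeline (assumptions noted below).
import cv2
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

def hex_to_rgb(hex_code):
    """Convert a '#RRGGBB' string to an (R, G, B) tuple."""
    hex_code = hex_code.lstrip("#")
    return tuple(int(hex_code[i:i + 2], 16) for i in (0, 2, 4))

# 1. Load the foundation-shade dataset and cluster its RGB values.
#    "shades.csv" and the column names are hypothetical.
shades = pd.read_csv("shades.csv")
rgb = np.array([hex_to_rgb(h) for h in shades["hex"]], dtype=float)
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(rgb)

# 2. Capture one frame from the webcam and average a small central patch
#    as the detected skin color (the study lets the user pick the region).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if not ok:
    raise RuntimeError("Could not read from webcam")
h, w = frame.shape[:2]
patch = frame[h // 2 - 20:h // 2 + 20, w // 2 - 20:w // 2 + 20]
skin_bgr = patch.reshape(-1, 3).mean(axis=0)
skin_rgb = skin_bgr[::-1]  # OpenCV frames are in BGR order

# 3. Find the cluster center closest to the detected skin color, then
#    recommend the dataset shade nearest to that center.
center = kmeans.cluster_centers_[
    np.argmin(np.linalg.norm(kmeans.cluster_centers_ - skin_rgb, axis=1))
]
best = shades.iloc[np.argmin(np.linalg.norm(rgb - center, axis=1))]

print(f"Recommended: {best['brand']} - {best['product']} ({best['hex']})")
```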

The research's short-term goal was to develop an algorithm that accurately matches a user's skin color to a foundation shade. Future work includes improving how the skin color value is collected, for example by letting the user import a higher-quality image or adjust the hue and saturation before the frame is captured (see the sketch below), and by refining the recommendation system across different makeup brands and products.
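One possible way the proposed pre-capture adjustment could work is to convert each preview frame to HSV, let the user nudge hue and saturation with OpenCV trackbars, and only then capture the frame used for skin-color sampling. The window name, trackbar names, and capture key are illustrative assumptions, not part of the original project.

```python
# Sketch of a pre-capture hue/saturation adjustment preview (assumed design).
import cv2
import numpy as np

cv2.namedWindow("preview")
cv2.createTrackbar("hue shift", "preview", 90, 180, lambda v: None)   # 90 = no shift
cv2.createTrackbar("sat scale", "preview", 100, 200, lambda v: None)  # 100 = no change

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV).astype(np.int16)
    hsv[..., 0] = (hsv[..., 0] + cv2.getTrackbarPos("hue shift", "preview") - 90) % 180
    hsv[..., 1] = np.clip(
        hsv[..., 1] * cv2.getTrackbarPos("sat scale", "preview") / 100, 0, 255
    )
    adjusted = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
    cv2.imshow("preview", adjusted)
    if cv2.waitKey(1) & 0xFF == ord("c"):  # press 'c' to capture this frame
        break
cap.release()
cv2.destroyAllWindows()
```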

