Code.Jam(2020) McGill Hackathon: and the winner is... A Virtual Fitting Room
For the second consecutive year, Sama was a Terabyte partner of the McGill Engineering Hackathon, the largest annual hackathon run by the McGill Electrical, Computer, and Software Engineering Student Society. This partnership is part of our close ties with McGill University and the broader Montreal machine learning community.
In this year defined by COVID-19, the CodeJam team opted for the very fitting theme of “Digital by Default.” Staying on topic, we proposed our own challenge, an “Online Retail and Shopping Smart App,” in which students got to interact with a custom fashion segmentation API trained on our iMaterialist open dataset. The participation was incredible! Out of 27 teams, 9 tackled our challenge. The submissions were split into two categories:
Recommendation Engine: “I have seen someone wear this, where can I find it?”
The Virtual Fitting Room: “How would this look on me?”
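To give a flavor of the challenge, here is a minimal sketch of how a team might have queried such a segmentation endpoint. The URL, authentication scheme, and response format are placeholders; the actual hackathon API is not documented here.

```python
import requests

# Hypothetical endpoint and key: the real hackathon API's URL,
# auth scheme, and response schema are assumptions for illustration.
API_URL = "https://example.com/v1/fashion/segment"
API_KEY = "YOUR_API_KEY"

def segment_garments(image_path: str) -> dict:
    """Send an image to the segmentation API and return per-garment masks."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape: {"segments": [{"label", "score", "mask"}, ...]}
    return response.json()

for segment in segment_garments("street_photo.jpg").get("segments", []):
    print(segment["label"], segment["score"])
```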
Most recommendation approaches involved extracting one or more pieces of clothing with the segmentation API and querying an image repository for similar items. The models produced good recommendations when the piece of clothing was well defined; results were less accurate when the source segmentation was only approximate. Factors such as the type of garment, occlusion (hair, jewelry, etc.), and image conditions like lighting affect segmentation quality and lead to an approximate product match.
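As an illustration of this retrieval pattern, the sketch below masks out a garment, embeds the crop with an off-the-shelf CNN, and ranks catalog items by cosine similarity. The ResNet backbone and the in-memory catalog are stand-ins, not any team's actual pipeline.

```python
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Off-the-shelf feature extractor; teams may have used other backbones.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the classifier, keep 512-d features
backbone.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed_garment(image: Image.Image, mask: np.ndarray) -> np.ndarray:
    """Blank out everything but the garment, crop it, and embed the crop."""
    rgb = np.array(image.convert("RGB"))
    rgb[mask == 0] = 255  # white out the background around the garment
    ys, xs = np.nonzero(mask)
    crop = Image.fromarray(rgb[ys.min():ys.max() + 1, xs.min():xs.max() + 1])
    with torch.no_grad():
        feat = backbone(preprocess(crop).unsqueeze(0)).squeeze(0).numpy()
    return feat / np.linalg.norm(feat)  # unit norm, so dot product = cosine

def rank_catalog(query: np.ndarray, catalog: dict) -> list:
    """Rank catalog items (name -> unit embedding) by similarity to the query."""
    scores = {name: float(query @ vec) for name, vec in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

A poor mask degrades this pipeline at the first step: if the crop includes hair, jewelry, or background, the embedding drifts and the nearest catalog items drift with it, which matches what the teams observed.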
My Wardrobe
Team “GradientBoys” took on the task of virtually showing clothes on a subject, a virtual fitting room of sorts. They implemented a complex pipeline: segmenting the subject photo (the person looking to try on the clothes), extracting the clothing mask, and segmenting the article of clothing from the library. This was followed by clever use of the OpenPose model for keypoint identification, which allowed them to derive a set of local and global distortions that reshape the new piece of clothing to fit the subject. Taking the in-person shopping experience entirely out of the equation, trying on clothes has never been this smart. It is all the more impressive considering it was built in less than 36 hours.
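The distortion step can be approximated with a thin-plate-spline warp driven by matched keypoints. The sketch below uses OpenCV's TPS transformer (from opencv-contrib-python) and assumes corresponding keypoints, e.g. shoulders and hips from a pose model such as OpenPose, are already available for both the garment and the subject; it is a simplification of the team's full pipeline, not their code.

```python
import cv2
import numpy as np

def warp_garment(garment: np.ndarray,
                 garment_pts: np.ndarray,
                 subject_pts: np.ndarray) -> np.ndarray:
    """Warp a garment image so its keypoints land on the subject's keypoints.

    garment_pts, subject_pts: matching (N, 2) arrays of pixel coordinates,
    e.g. shoulder, hip, and hem positions from a pose model like OpenPose.
    """
    tps = cv2.createThinPlateSplineShapeTransformer()
    matches = [cv2.DMatch(i, i, 0) for i in range(len(garment_pts))]
    # warpImage applies the backward mapping, so the target (subject) points
    # are passed first and the source (garment) points second.
    tps.estimateTransformation(
        subject_pts.reshape(1, -1, 2).astype(np.float32),
        garment_pts.reshape(1, -1, 2).astype(np.float32),
        matches,
    )
    return tps.warpImage(garment)
```

The warped garment can then be alpha-composited onto the subject photo using its segmentation mask.

Awesome work, team GradientBoys, and thank you to McGill and all the CodeJam student execs for organizing this amazing event. Looking forward to the years to come!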