This is a project for IBM's BlueHack 2017.
- Challenge: #VideoComprehensionSolutions
- Team Name: Connected Cognition
- Location: #Yorktown / #Poughkeepsie
- Members:
- Vaisakhi Mishra - [email protected]
- Charlotte Wright - [email protected]
- Khoi-Nguyen Mac - [email protected]
- Pat Pataranutaporn - [email protected]
- German Abrevaya - [email protected]
- Hack Dash: https://hackdash.org/projects/5970c0c97a30a4526a1fe9a7
- Video Link: https://drive.google.com/open?id=0B_INX3BifvJoUV92SGNFN0Z2VGs
An emotion-analysis-based, real-time scene-modifying video player. The current prototype enhances the audience's experience of a horror movie or video by manipulating effects in the video according to the viewer's facial expressions.
- Link to Presentation: https://youtu.be/-YdCxjJUEQQ
- Link to 3 minute pitch video: https://drive.google.com/open?id=0B_INX3BifvJoUV92SGNFN0Z2VGs
- Demo: https://www.youtube.com/watch?v=GNbztukrhgo
- Demo 2: https://youtu.be/Y3HMVxHbdFw
- GitHub: https://github.com/knmac/scarifier
While the film is playing, the watcher's emotion is recognized by the Watson Facial Recognition API and categorized as either negative or positive. The movie player uses this data to change the content of the video. The film is pre-analyzed with the Watson Visual Recognition API, which tags the video's content to improve the quality of the changes. An augmentation library then changes the mood of the film with music and visual effects. The final product is a video whose atmosphere changes based on the viewer's reactions and the content of the video.
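The feedback loop described above can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the repository's actual code: `classify_emotion` stands in for the Watson call, and the single intensity value stands in for the augmentation library's music/visual effect strength.

```python
def adjust_intensity(current, emotion, step=0.1):
    """Nudge the scare-effect intensity based on the viewer's reaction.

    A 'positive' (calm/amused) reading means the scares are not landing,
    so intensity goes up; a 'negative' (fearful) reading eases it off.
    Intensity is clamped to the range [0.0, 1.0].
    """
    if emotion == "positive":
        return min(1.0, current + step)
    return max(0.0, current - step)


def playback_loop(frames, classify_emotion):
    """Hypothetical player loop: classify the viewer on each frame and
    record the effect intensity applied to that frame."""
    intensity = 0.5  # start at a neutral effect level
    timeline = []
    for frame in frames:
        emotion = classify_emotion(frame)  # stand-in for the Watson API call
        intensity = adjust_intensity(intensity, emotion)
        timeline.append(intensity)
    return timeline
```

In the real player, `classify_emotion` would send a webcam frame to the facial-recognition service, and the returned intensity would drive which pre-tagged effects the augmentation library applies.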
This graph tracks the appearance of the candidate (the scary monster in this case). It uses the Watson Visual Recognition API to analyze the on-screen content, making it easier to pinpoint where the editor could quickly increase the fear factor.
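As an illustration of how such per-frame tags could translate into candidate edit points, the sketch below scans a tag sequence (as the pre-analysis pass might produce) for contiguous spans where the monster is on screen. The function name and the tag format are assumptions for this sketch, not the project's actual data model.

```python
def appearance_spans(frame_tags, target="monster"):
    """Return (start, end) frame-index spans where `target` is tagged
    on screen; `end` is exclusive. These spans mark the moments where
    an editor could layer in extra scare effects."""
    spans = []
    start = None
    for i, tags in enumerate(frame_tags):
        if target in tags:
            if start is None:
                start = i  # the target just entered the frame
        elif start is not None:
            spans.append((start, i))  # the target just left the frame
            start = None
    if start is not None:  # still on screen at the end of the clip
        spans.append((start, len(frame_tags)))
    return spans
```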