Data Story
Interactive Culture
This project addresses two SA-based challenges: Digital Culture and Physical Culture. Consequently, our project is made of two parts. The first is a website that uses neural style transfer to send a present-day image back in time to the colonial era, in the image style of your choice. The second is an interactive exhibit that matches your face with the face of a colonist.
Digital Culture - Neural Style Transfer
This project used both the Old Colonists Photographs dataset from the State Library of South Australia and the South Australian Government Photographic Collection from the History Trust of South Australia. To preprocess the data we used a facial detection and recognition system to automatically detect people in the photographs and crop them out at an appropriate size; a minimal sketch of this step follows the dataset links below.
https://data.sa.gov.au/data/dataset/south-australian-government-photographic-collection
https://data.sa.gov.au/data/dataset/old-colonists-photographs
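As an illustration of the preprocessing step, the sketch below uses the MTCNN detector from the facenet-pytorch package to find and crop faces. MTCNN is our assumed choice of detector here, and the directory names are placeholders.

    # Sketch of the preprocessing step: detect the face in each photograph
    # and save a cropped copy. Uses facenet-pytorch's MTCNN detector; the
    # directory names are placeholders.
    from pathlib import Path

    from PIL import Image
    from facenet_pytorch import MTCNN

    # image_size controls the side length of the square crop that is saved.
    mtcnn = MTCNN(image_size=160, margin=20)

    src_dir = Path("raw_photos")      # placeholder: original dataset photos
    dst_dir = Path("cropped_faces")   # placeholder: output directory
    dst_dir.mkdir(exist_ok=True)

    for photo in src_dir.glob("*.jpg"):
        img = Image.open(photo).convert("RGB")
        # MTCNN returns None when no face is found; save_path writes the crop.
        face = mtcnn(img, save_path=str(dst_dir / photo.name))
        if face is None:
            print(f"No face detected in {photo.name}")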
Physical Culture - Colonist Face Matching
This project is a physical interactive exhibit that uses a camera and machine learning to show you which Adelaide colonist you most resemble. Simply stand in front of the camera and a similar-looking colonist will immediately appear on the screen.
The past doesn't feel that far away when you know there was someone who looked like you walking around.
--Colonist Portrait Images
https://data.sa.gov.au/data/dataset/old-colonists-photographs
We found this dataset from the State Library, containing just over 1,000 old colonist portraits. The dataset consists of CSV files with information about each photo, including a download URL, with separate CSVs for men and women.
We used Python and pandas to load the CSV files and concatenate them into a single DataFrame. We then iterated over each row and downloaded each colonist image from the web. The images were saved with the record ID as the filename so that they could be associated with the metadata later if needed.
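A minimal sketch of this download step is below. The CSV filenames and column names ("record_id", "download_url") are hypothetical stand-ins for the dataset's real headers.

    # Sketch of the download step. The CSV column names ("record_id",
    # "download_url") are hypothetical; the real dataset's headers may differ.
    from pathlib import Path

    import pandas as pd
    import requests

    men = pd.read_csv("colonists_men.csv")      # placeholder filenames
    women = pd.read_csv("colonists_women.csv")
    df = pd.concat([men, women], ignore_index=True)

    out_dir = Path("colonist_images")
    out_dir.mkdir(exist_ok=True)

    for _, row in df.iterrows():
        # Name each file after its record ID so it can be joined back
        # to the metadata later.
        dest = out_dir / f"{row['record_id']}.jpg"
        if dest.exists():
            continue  # skip files already downloaded
        resp = requests.get(row["download_url"], timeout=30)
        resp.raise_for_status()
        dest.write_bytes(resp.content)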
--FaceNet Features
https://pypi.org/project/facenet-pytorch/
https://github.com/timesler/facenet-pytorch
FaceNet is an open-source project that uses a deep convolutional neural network to convert an image of a face into a feature vector. The feature vectors are 512 elements long and encode the distinguishing features of a face. The training objective pushes embeddings of the same person close together and embeddings of different people far apart, so to check whether two faces are similar you can simply compute the L2 distance between their vectors.
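For example, comparing two faces with facenet-pytorch looks roughly like this (the image file names are placeholders):

    # Sketch of comparing two faces with facenet-pytorch: crop each face
    # with MTCNN, embed it with InceptionResnetV1, then take the L2 distance.
    import torch
    from PIL import Image
    from facenet_pytorch import MTCNN, InceptionResnetV1

    mtcnn = MTCNN(image_size=160)
    # Pretrained on VGGFace2; eval() puts the network in inference mode.
    resnet = InceptionResnetV1(pretrained="vggface2").eval()

    def embed(path: str) -> torch.Tensor:
        """Return a 512-element feature vector for the face in the image."""
        face = mtcnn(Image.open(path).convert("RGB"))  # 3x160x160 tensor
        with torch.no_grad():
            return resnet(face.unsqueeze(0)).squeeze(0)  # shape (512,)

    # placeholder file names
    distance = (embed("face_a.jpg") - embed("face_b.jpg")).norm().item()
    print(f"L2 distance: {distance:.3f}  (smaller = more similar)")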
--Colonist Portrait Face Features
To match faces quickly, we used a Python script and FaceNet to precompute a feature vector for every colonist image. The feature vectors were saved in a Python dictionary keyed by record ID.
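A sketch of that precomputation step, assuming the portraits were saved as "<record ID>.jpg" as described above and that the dictionary is pickled to disk:

    # Sketch of precomputing a feature vector per colonist portrait and
    # saving the results in a dict keyed by record ID. File names and the
    # pickle output path are placeholders.
    import pickle
    from pathlib import Path

    import torch
    from PIL import Image
    from facenet_pytorch import MTCNN, InceptionResnetV1

    mtcnn = MTCNN(image_size=160)
    resnet = InceptionResnetV1(pretrained="vggface2").eval()

    features: dict[str, torch.Tensor] = {}
    for path in Path("colonist_images").glob("*.jpg"):
        face = mtcnn(Image.open(path).convert("RGB"))
        if face is None:
            continue  # skip portraits where no face was detected
        with torch.no_grad():
            # path.stem is the record ID, given the naming scheme above.
            features[path.stem] = resnet(face.unsqueeze(0)).squeeze(0)

    with open("colonist_features.pkl", "wb") as f:
        pickle.dump(features, f)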
--Real Time Predictions
We use Python and OpenCV to capture live images from the webcam, and FaceNet to compute a feature vector for every frame that contains a face. We then compute the L2 distance between this feature vector and every precomputed colonist feature vector. The colonist with the smallest L2 distance has the most similar face and is shown on screen.
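Putting it together, the real-time loop might look like the following sketch, which loads the pickled feature dictionary from the previous step; window handling and on-screen layout are simplified.

    # Sketch of the real-time matching loop: grab webcam frames with OpenCV,
    # embed any detected face, and show the nearest colonist portrait.
    # Assumes the pickled feature dict and image directory from above.
    import pickle

    import cv2
    import torch
    from PIL import Image
    from facenet_pytorch import MTCNN, InceptionResnetV1

    mtcnn = MTCNN(image_size=160)
    resnet = InceptionResnetV1(pretrained="vggface2").eval()

    with open("colonist_features.pkl", "rb") as f:
        features = pickle.load(f)  # record ID -> 512-element tensor

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV gives BGR arrays; FaceNet expects RGB images.
        face = mtcnn(Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)))
        if face is not None:
            with torch.no_grad():
                vec = resnet(face.unsqueeze(0)).squeeze(0)
            # Nearest neighbour by L2 distance over all colonist vectors.
            best_id = min(features,
                          key=lambda rid: (features[rid] - vec).norm().item())
            match = cv2.imread(f"colonist_images/{best_id}.jpg")
            cv2.imshow("Closest colonist", match)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()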