
I’m sick of this framing. Tired of it. Many people have tried to explain, many scholars. Listen to us. You can’t just reduce harms caused by ML to dataset bias.
ML systems are biased when data is biased. This face upsampling system makes everyone look white because the network was pretrained on FlickFaceHQ, which mainly contains white people pics. Train the *exact* same system on a dataset from Senegal, and everyone will look African. https://t.co/jKbPyWYu4N
— Yann LeCun (@ylecun) June 21, 2020
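For context on the mechanism the quoted tweet describes: when the input is heavily degraded, a reconstruction model has to fall back on whatever distribution it was trained on. The sketch below is a deliberately simple 1-D toy (a Bayes-style guess with made-up group proportions and noise levels, not the actual PULSE/FFHQ pipeline) showing how a 90/10 training imbalance pushes almost all ambiguous inputs toward the majority group.

```python
# Minimal toy sketch (assumed numbers, not the real face-upsampling system):
# with a 90/10 training prior and a very noisy observation, a Bayes-optimal
# guess assigns nearly every degraded minority-group input to the majority.
import numpy as np

rng = np.random.default_rng(0)

# Training distribution: 90% group A (feature ~ N(+1, 1)), 10% group B (~ N(-1, 1)).
prior_a, prior_b = 0.9, 0.1
noise_sd = 3.0  # heavy degradation: observation = true feature + N(0, 3^2)

def reconstruct_group(x):
    """Pick the group with the higher posterior given a blurry observation x."""
    var = 1.0 + noise_sd**2  # marginal variance of x under either group
    log_post_a = np.log(prior_a) - (x - 1.0) ** 2 / (2 * var)
    log_post_b = np.log(prior_b) - (x + 1.0) ** 2 / (2 * var)
    return "A" if log_post_a > log_post_b else "B"

# Degrade 10,000 genuine group-B inputs and see how they get reconstructed.
true_b = rng.normal(-1.0, 1.0, size=10_000)
observed = true_b + rng.normal(0.0, noise_sd, size=10_000)
as_majority = np.mean([reconstruct_group(x) == "A" for x in observed])
print(f"group-B inputs reconstructed as majority group A: {as_majority:.1%}")
```

This only illustrates the dataset-skew mechanism the quoted tweet points to; the thread's argument is that this mechanism is not the whole story of ML harms.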