FaceApp denies storing users’ photographs without permission

The developer of a popular app which transforms users’ faces to predict how they will look as older people has insisted it is not accessing users’ photographs without permission.

FaceApp, which was launched by a Russian developer in 2017, uses artificial intelligence allowing people to see how they would look with different hair colour, eye colour or as a different gender.

The app has topped download charts again this week, after users homed in on its ageing filter, which has since been used by dozens of celebrities and prominent figures to picture how they will supposedly look in several decades’ time.

This surge of interest has in turn created concerns that FaceApp is systematically harvesting users’ images. People who upload their image to the app transfer the picture to a server controlled by the developer, with the photograph processing done remotely, rather than on their phone.

These concerns have been heightened by growing awareness of online privacy issues in recent years, by the app’s loosely phrased privacy policy, and by the fact that the developer is based in Russia, where many high-profile online misinformation campaigns have originated.

FaceApp has previously received attention for the ethics of some of its filters. In April 2017 the app’s makers apologised for a feature that whitened people’s faces when they selected the “hot” filter, leading to accusations that it considered lighter skin to be synonymous with attractiveness. The developers said this was due to a flaw in the underlying neural network, which was skewed towards Caucasian faces.
