When software engineer Vishnu Mohandas left Google in 2020, he didn't just end his employment; he also stopped using Google Photos. He had learned of Google's collaboration with the US military on AI for analyzing drone footage, and he feared that his personal photos might be used to train AI systems with potentially unethical applications. Seeking a safer way to store and share photos, he founded Ente, a service that prioritizes privacy through an open source design and end-to-end encryption.
Although Ente amassed over 100,000 users, many from the privacy-conscious community, Mohandas struggled to convey the downsides of relying on Google Photos to a broader audience. A solution emerged when an Ente intern proposed demonstrating just how much Google's AI can infer from a photo. The result was TheySeeYourPhotos.com, a website where users upload a photo and receive a detailed AI-generated description produced by Google's computer vision technology.
In a test with a family photo, Google's analysis surfaced striking specifics, including the model of the watch his wife was wearing. More troublingly, the AI linked the watch to extremist groups, prompting the team to refine the generated descriptions to focus on more neutral content. Even after that change, the less alarming output still carried built-in assumptions about the identities and backgrounds of the people in the photo.
Google did not comment on Ente specifically, but it has indicated through support channels that images uploaded to Google Photos are analyzed mainly to enhance the user experience, not for resale or advertising.
The broader implications of advanced AI photo analysis are considerable. Ente's site lets users run the AI on various stock images to illustrate how much can be inferred from seemingly innocuous photos. Mohandas warns that as people accumulate ever-larger digital photo libraries, they inadvertently hand over vast amounts of data from which psychological profiles could be drawn, opening the door to future manipulation by advertisers, employers, or others.
The appeal of Ente's alternative lies in privacy and user control, but that comes with trade-offs: fewer advanced features, and the risk of permanently losing access if an encryption key is lost or forgotten. Nonetheless, Mohandas remains committed to keeping his family's memories safe, and he urges caution in entrusting personal data to major platforms, stressing that no one can predict how that data might be used down the line.