In May 2024, Meta announced that it would begin using public data from Facebook, Instagram, and WhatsApp users to train its AI models starting June 26, 2024, a decision that underscores how heavily modern AI depends on large volumes of training data.
Amid the controversy surrounding this news, Meta has since paused training of these AI models in Europe (they are already available in other parts of the world), following a request from its lead regulator, Ireland's Data Protection Commission (DPC).
The change in policy
The announcement has raised concerns among users who fear their personal data may be used inappropriately. Users can take steps to protect their privacy: review and adjust the privacy settings on their social media accounts, be mindful of what information they share online and with whom, and use strong passwords, changing them regularly to reduce the risk of account compromise.
Under the new policy, Meta may use public content from its platforms to improve its AI technologies. This includes posts, photos, comments, and other publicly available data. Importantly, private messages and posts restricted to friends and family will not be included in this data set.
The company says the change will improve the performance and accuracy of its AI models, which underpin many of its services and features. The approach, however, has not been without criticism.
Privacy concerns
The announcement has sparked significant pushback, particularly around issues of user consent and data privacy. Many users are concerned about the extent to which their publicly shared information will be used and the potential implications for their privacy. The process for opting out of this data use has also been a point of contention.
Privacy guide
It is not yet clear how this controversy will evolve, but for users who prefer not to have their data used in this way, Meta has provided an opt-out mechanism. The process, while not complicated, is buried deep within the platform's settings.
At FJ Digital we want to make this easy and visible, so here is a step-by-step guide to managing your privacy before June 26, 2024:
- Visit Meta’s data settings page: Log in to your Facebook or Instagram account and navigate to the settings page dedicated to data privacy.
- Access the opt-out form: Find the section related to AI data usage. Meta has made this form available, but users have reported that it can be hard to locate.
- Complete the form: Provide the necessary details as requested. This may include specifying the types of data you do not want used and explaining your reasons for opting out.
- Submit the form: After completing the form, submit it and wait for confirmation from Meta that your request has been processed. Remember that opting out will only affect your personal data and not necessarily any images or content shared by friends who mention you.
- Monitor your settings: Periodically review your privacy settings and Meta updates to ensure your preferences are respected and that no new policies have overridden your choices.
Looking ahead
Meta’s new policy on using public user data to train AI marks a turning point in data use practices.
The decision to move forward with AI technology highlights the importance of discussing the cost of using “free” platforms, as well as the need to protect personal data.
As users, it’s critical to stay informed about how our data is being used and take proactive steps to protect our privacy.