Meta just announced that it is updating its advertising policies to require advertisers to disclose when political, election, or social issue advertising has been digitally created or altered, including through the use of artificial intelligence. The policy goes into effect in 2024.
Under Meta's new policies, advertisers will be required to disclose when social issue, election, or political advertising uses a "photorealistic image or video" or "realistic sounding audio" that was digitally created or altered to:
- Depict a real person as saying or doing something that they did not say or do;
- Depict a realistic person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
- Depict a realistic event that allegedly occurred, but that is not a true image, video, or audio recording of the event.
The policy does not require disclosure, however, if the digitally created or altered content is "inconsequential or immaterial to the claim, assertion, or issue raised in the ad." This includes adjusting image size, cropping, color correction, or image sharpening, "unless such changes are consequential or material to the claim, assertion, or issue raised in the ad."
Meta also said that, when an advertiser discloses in the advertising flow that content is digitally created or altered, Meta will add information about the disclosure to the ad itself.
Google announced a similar move in September.
This alert provides general coverage of its subject area. We provide it with the understanding that Frankfurt Kurnit Klein & Selz is not engaged herein in rendering legal advice, and shall not be liable for any damages resulting from any error, inaccuracy, or omission. Our attorneys practice law only in jurisdictions in which they are properly authorized to do so. We do not seek to represent clients in other jurisdictions.