Recently, a donation appeal released by the Chengdu Aiyixing Public Welfare Service Center sparked controversy. It featured a close-up of an elderly woman with sincere eyes, captioned: "Grandma Zhou, who is over 80 years old, takes care of her paralyzed husband, and her daughter has fallen seriously ill again; please help her raise money for living expenses." A blogger then posted that the page appeared to use an AI-synthesized photo and might constitute fraudulent fundraising.
Image source: Jimu News
In response, the platform's customer service staff explained that the frontal photo had been synthesized with AI to protect the subject's privacy, and that AI-generated supplementary images would be labeled in the future. The Chengdu Civil Affairs Bureau stated that the fundraising project involved has been taken down and the organization has been ordered to rectify its practices; donors who were misled will be refunded.
Authenticity is the basic prerequisite for charitable fundraising. According to the customer service staff, some of the people featured in the project stories were unable to provide photos for various reasons, or did not want their photos made public. To protect their privacy, the organization would present them using side-angle shots, photos taken from behind, AI-generated images, scenery shots and other methods.
However, synthesizing close-up photos of the parties with AI, without a clear label, is fundamentally different from using side or back shots: the synthesized image diverges significantly from the person's real appearance, which violates the principle of authenticity. As one netizen put it, "I would rather you clearly pixelate the photo and tell me it's to protect privacy than see a 'well-intentioned' retouched picture, let alone one generated by AI." Such fabricated bubbles inevitably raise the cost of social trust and leave donors with the bitter feeling of having been deceived.
Of course, some argue that if, as the organization claims, everything apart from the old woman's frontal photo is true and credible, then no real harm has been done. This view overlooks the fact that the main fundraising image, as a form of visual language, carries an influence that cannot be ignored. Through photos, people intuitively grasp a person's living environment, physical condition and family situation; a pair of sincere eyes or a warm smile may be precisely what touches people's hearts and moves them to help.
Yet such charitable impulses should be inspired by real details, not induced by fabricated photos engineered for "impact." When a personal photo can be "interpreted" at will by AI, people have every reason to ask: is other information about the person also padded? Are the operations of the organization sufficiently professional and standardized? Does it deserve donors' trust? The credibility of charitable work must be carefully protected. Once doubt begins to spread, the foundation of social trust is shaken, and the direct result is that people will hesitate and hold back the next time they see a similar appeal for help. Obviously, that is unfair to those who genuinely need assistance.
Image source: Visual China
It is understandable for charitable organizations to seek the consent of recipients and their families in advance and to protect their privacy when releasing information. But personal information must not be fabricated, and donors' right to know must also be respected. China's Charity Law stipulates that fundraising activities must respect and protect the legitimate rights and interests of those being solicited, safeguard their right to know, and must not deceive or induce them to donate through fabricated facts or other means. Measured against this, pixelated photos, side or back shots, or abstract promotional images are all more lawful and reasonable than AI-synthesized ones.
In fact, truth itself has the power to move people. Rather than using fake pictures to attract attention and arouse sympathy, crossing the red line of authenticity, the organization would do better to dig out real details. The author noticed that the text describing Grandma Zhou's situation on the donation page is rather general, for example: "Grandma Zhou, a lonely elderly woman living in the countryside, does not enjoy the comfort of family. Sometimes no one speaks a word to her for an entire day, and even three simple meals a day have become a problem."
If, on the basis of verifying the facts and respecting her wishes, the page described her real predicament in greater detail and made clear what kind of help Grandma Zhou needs and where the donations would go, it could just as well arouse people's empathy and love, and thereby help those seeking aid through their difficulties.
In short, authenticity is the cornerstone of charitable work, and appeals for help are by no means products to be casually "packaged." Adhering to the norms of public welfare and avoiding the abuse of AI synthesis technology should become a social consensus.
Written by / Huang Shuai
Edited by / Ren Guanqing
Source: China Youth Daily client