Webmaster's Home (ChinaZ.com) December 18 news: Virginia Tech in the United States recently released a report outlining potential bias in the output of the artificial intelligence (AI) tool ChatGPT on environmental justice issues in different counties.

In the report, Virginia Tech researchers note that ChatGPT has limitations in providing information on environmental justice issues in specific areas.

The study found a trend indicating that larger, more densely populated states have greater access to this information. "In states with more urban populations, such as Delaware or California, less than 1 percent of the population lives in counties that do not receive specific information," the report states.

Meanwhile, less populous areas lack equivalent access to information.

"In rural states like Idaho and New Hampshire, more than 90 percent of the population lives in counties where specific local information is not available," the report states. The

report further quoted Kim, a lecturer in Virginia Tech's Department of Geography, who called for further research because biases are being uncovered.

"While more research is needed, our findings reveal that the ChatGPT model currently suffers from geographic bias," Kim declared. The

research paper also includes a map showing the range of populations in the United States for which location-specific information on environmental justice issues is not available.

This discovery comes on the heels of recent news that academics have found ChatGPT may exhibit political bias.

Cointelegraph previously reported that a study published by researchers from the UK and Brazil stated that the text output by large language models like ChatGPT contains errors and biases that can mislead readers.