The US Federal Trade Commission is investigating whether OpenAI's ChatGPT has harmed consumers by making false or misleading statements about them.
OpenAI, which is backed by Microsoft, was informed of the probe in a 20-page questionnaire asking the company to describe incidents in which users were falsely disparaged and to detail any steps it has taken to prevent them from recurring.
The Washington Post was the first to report on the probe by the US regulator.
Large language models (LLMs), a type of generative AI that can produce human-like text in seconds, captured worldwide attention when OpenAI released ChatGPT last November.
Despite widespread excitement about the new technology's possibilities, complaints soon surfaced that the models could produce insulting, misleading, or outright fabricated material (often referred to as “hallucinations”).
While FTC chair Lina Khan did not specifically reference the probe, she did tell lawmakers on Wednesday that the FTC was concerned about libellous content being generated by ChatGPT.
For example, “we’ve heard about reports where people’s sensitive information is showing up in response to an inquiry from somebody else,” Khan added.
“We’ve heard rumours of slander and defamation and outright lies. We’re worried about fraud and dishonesty like that,” she continued.
According to the questionnaire, the FTC is examining OpenAI's use of private data to develop its industry-leading model, with a particular focus on how that practice could harm consumers.
ChatGPT is built on the company's GPT-4 technology, as are dozens of applications from other firms that pay OpenAI a licence fee to use the underlying model.
If the FTC is satisfied with the target company’s response to an investigation, it may drop the matter without taking any further action.
If the FTC finds harmful or illegal practices, it will take legal action to stop them.
We asked OpenAI and the FTC for comment, but neither organisation responded.