OpenAI has also changed its data retention policy, which could reassure companies considering experimenting with ChatGPT. The company said it would now retain user data for only 30 days and promised it would not use data entered by users to train its models.
According to David Foster, partner at Applied Data Science Partners, a London-based data science and AI consultancy, this will be “critical” in getting businesses to use the API.
Foster believes the fear that customers’ personal information or critical business data could be swallowed up by ChatGPT’s training models has kept businesses from adopting the tool to date. “It shows a lot of commitment on OpenAI’s part to basically say, ‘Look, you can use this now without risk to your business. You’re not going to find your business data in this general model,’” he says.
This policy change means businesses can feel in control of their data, rather than having to trust a third party – OpenAI – to manage where it goes and how it is used, according to Foster. Until now, “you were effectively building this system on someone else’s architecture, in accordance with someone else’s data use policy,” he says.
This, combined with the falling price of access to large language models, means that there will likely be a proliferation of AI chatbots in the near future.
API access to ChatGPT (or, more officially, what OpenAI calls GPT-3.5) is 10 times cheaper than access to OpenAI’s less powerful GPT-3 API, launched in June 2020, which could generate convincing language when prompted but didn’t have the same conversational appeal as ChatGPT.
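For developers, wiring ChatGPT into a product comes down to a single API call. Below is a minimal sketch using OpenAI’s official openai Python package as it shipped around the API’s launch; the API key location and the example prompt are illustrative assumptions, not part of anyone’s actual product.

```python
# Minimal sketch (assumes `pip install openai` and an API key stored in the
# OPENAI_API_KEY environment variable; the prompt is purely illustrative).
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the model OpenAI exposes for ChatGPT-style chat
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain in one sentence what an API is."},
    ],
)

# The reply comes back as a list of choices; print the first one's text.
print(response["choices"][0]["message"]["content"])
```

The chat-style messages list, with system and user roles, is what distinguishes this endpoint from the older GPT-3 completion API, which took a single block of prompt text.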
“It’s a lot cheaper and a lot faster,” says Alex Volkov, founder of Targum, a language translator for videos that was built unofficially on top of ChatGPT during a hackathon in December 2022. “That doesn’t usually happen. In the world of APIs, prices generally go up.”
This could change the economics of AI for many companies and spark a new wave of innovation.
“It’s a great time to be a founder,” says QuickVid’s Habib. “Due to the low cost and ease of integration, every application is going to have some type of chat interface or LLM [large language model] integration… People are going to have to get used to talking to AI.”