According to correspondence between Apple Inc. and the developer of an email app that uses ChatGPT, the company has postponed approval of an update with AI-powered language capabilities over concerns that it could produce content inappropriate for children. The developer disputes Apple's decision.
The dispute highlights the widespread uncertainty about the suitability of language-generating artificial intelligence tools such as ChatGPT.
Apple Unsure Of ChatGPT
According to Ben Volach, co-founder of Blix Inc., the developer of the BlueMail email app, Apple moved last week to block an update to BlueMail over concerns that a new AI feature in the app could display inappropriate content. BlueMail's new AI feature uses OpenAI's latest ChatGPT chatbot to help automate email composition, drawing on the contents of previous emails and calendar events.
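BlueMail has not published the details of its integration, but a minimal sketch of the general approach, using the OpenAI Python SDK with an assumed model choice and a hypothetical draft_reply helper, might look like this:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_reply(previous_emails, calendar_events, instruction):
    """Ask a chat model to draft an email from prior messages and calendar context.

    This is an illustrative sketch, not BlueMail's actual implementation.
    """
    context = "\n\n".join(previous_emails)
    agenda = "\n".join(calendar_events)
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; BlueMail's actual choice is not public
        messages=[
            {"role": "system",
             "content": "You draft concise, polite emails for the user."},
            {"role": "user",
             "content": f"Previous emails:\n{context}\n\n"
                        f"Upcoming calendar events:\n{agenda}\n\n"
                        f"Task: {instruction}"},
        ],
    )
    return response.choices[0].message.content

The key idea is that the app assembles context the user already has (earlier emails, calendar entries) into a prompt and lets the model generate the draft, which is also why reviewers worry about what the model might return.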
ChatGPT lets people hold conversations with an AI that appears human-like and can generate sophisticated long-form content on a range of subjects. According to the documents, Apple's app review team recommended that BlueMail raise its minimum age requirement to 17 and older or implement content filtering, because the app could produce content that is not suitable for all audiences.
The app already has content-filtering capabilities, according to Mr. Volach, and its age rating is currently set at 4 years old and up. Apple applies a 17-and-older restriction to categories of apps that may include anything from foul language to sexual content and drug references. Mr. Volach says the request is unreasonable, noting that Apple customers already have access to other apps with similar AI capabilities and no age restrictions.
A spokeswoman said Apple is looking into Blix's complaint and noted that developers can appeal decisions to the App Review Board. Driven by ChatGPT, OpenAI's chatbot, so-called generative AI has become one of the most closely watched emerging technologies in recent years.