What is GPT-3?
Generative Pre-trained Transformer 3 (GPT-3) is an advanced AI language model that can produce human-like text. The text it generates is often indistinguishable from text written by humans. The model was trained on a vast amount of textual material available online, so the output it produces reflects what has already been written and shared on the Internet.
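GPT-3 itself is proprietary and enormously more sophisticated, but the core idea of a language model — learn from existing text, then repeatedly predict a plausible next word — can be sketched with a toy bigram model. This is purely illustrative and is not GPT-3's architecture (GPT-3 uses a transformer network, not word counts):

```python
import random
from collections import defaultdict

# Toy bigram language model: record which word follows which in a tiny
# training corpus, then generate text by sampling a plausible next word.
# GPT-3 does the same job at vast scale with a transformer network.

corpus = "the model reads text and the model predicts the next word".split()

# Count next-word occurrences for each word in the corpus.
following = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    following[current].append(nxt)

def generate(start, length=5, seed=0):
    """Generate up to `length` words continuing from `start`."""
    random.seed(seed)
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:  # no known continuation; stop early
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the"))
```

Because the model can only recombine what it has seen, its output mirrors its training data — which is exactly why biases and low-quality material in the source text resurface in the generated text, as discussed below.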
The problems with GPT-3
Text generated by GPT-3 can be racially biased, and people have already shared quite a few results illustrating this. Jerome Pesenti (Head of AI at Facebook) said that GPT-3 is surprising and creative, but it is also unsafe due to harmful biases. He further stated that ‘We need more progress on Responsible AI before putting NLG models in production.’ Prompted to write tweets from a single word – Jews, black, women, holocaust – it came up with troubling results (https://thoughts.sushant-kumar.com).
In the GPT-3 paper, OpenAI acknowledges that its API models have displayed algorithmic biases in generated text. The OpenAI team also warned about GPT-3’s potential for malicious use in spam and fraudulent activity, including deepfakes.
On the other hand, these advances in text-generating models will significantly influence the future of literature. With such language models, it can be expected that a significant portion of all written content in the near future will be computer-generated. The “high-quality” texts created by the model are generally indistinguishable from human writing to casual readers.
Deepfakes, however, are not the only concern. One line of thought worries about the “data pollution” that GPT-3 text could cause, as Andrej Karpathy (Director of AI at Tesla) noted in a tweet. The content GPT-3 produces is based on existing data on the Internet, a significant majority of which is neither verified nor published by accountable authors. This may steer the model toward the same direction of content creation, and the standard of the material is likely to fall. The debate extends to the effect on future generations, who will have a hard time finding genuinely high-quality work in a haystack of generated content.
GPT-3 could also have a significant effect on the job market. It has already been shown to generate non-trivial computer code. Sharif Shameem, the founder of debuild.co, tweeted how his team used GPT-3 to build an app that can write code. This breakthrough could pose a challenge to coders and software engineers, whose roles may be reshaped as text generators improve. Likewise, the ability to generate text content may threaten the jobs of content writers, journalists, scriptwriters, and others.
With its impressive capabilities and innovative use cases, GPT-3 is expected to attract even more interest in the future. However, it carries serious societal implications. GPT-3’s current state is far from perfect, and it may require significant improvements before it can be considered safe and reliable to use.