Recently, almost every corner of the Internet has been discussing OpenAI's new product. As the company puts it, they trained a model called ChatGPT that interacts with users in a conversational way (in the form of a chat). The dialogue format allows ChatGPT to answer follow-up questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. The product has been hailed as a revolution and a milestone in the development of artificial intelligence (AI).
Software developers quickly began testing the tool. It turned out that typing a prompt asking ChatGPT to write a code snippet or an application produced meaningful results in the form of working code. Something that previously took a programmer far more time could suddenly be done in a few minutes.
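To make the interaction concrete, here is a minimal sketch of requesting a code snippet from a GPT-3 family model through OpenAI's Python client. The prompt, model name, and key handling are illustrative assumptions rather than details from the article; the same request could just as well be typed into the ChatGPT web interface.

```python
# Minimal sketch: asking a GPT-3 family model to generate code.
# Assumes the official `openai` Python client (pip install openai)
# and an API key from the user's OpenAI account.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; keep real keys out of source code

prompt = "Write a Python function that validates an email address."

response = openai.Completion.create(
    model="text-davinci-003",  # illustrative GPT-3 model name
    prompt=prompt,
    max_tokens=256,
    temperature=0,  # low temperature for more predictable code output
)

# The generated code arrives as plain text and still needs human review.
print(response.choices[0].text)
```

Whichever channel is used, the generated code arrives with no indication of what material the model learned its patterns from, which is precisely what drives the legal concerns discussed below.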
From a business and legal perspective, developers using such tools in commercial projects can create problems for the companies that employ them. The most frequently mentioned issues include:
- incorrect calculations,
- failure to cite the source of information, and thus potential intellectual property and copyright problems,
- incorrect answers to the questions asked,
- code that looks plausible but turns out to be incorrect in the long run.
Because of risks like these, software companies are already introducing provisions into their agreements with developers that prohibit the use of AI tools to create content and programming code.
Below is an example of such a clause used in the American market:
“Content Generation Tool” means the product known as GPT-3 (Generative Pre-trained Transformer 3) and any other tool that automatically generates content using artificial intelligence technologies trained, in whole or in part, using data and material as to which third parties may assert ownership. Contractor shall comply with Company’s policy prohibiting use of Content Generation Tools in the course of Contractor’s provision of services hereunder.