Hollywood writers don’t want to give generative AI any credit


A hot potato: The Writers Guild of America is debating how to deal with ChatGPT and other generative AI algorithms when it comes to scriptwriting. The organization appears willing to allow AI-assisted works, but only if the AI is stripped of any authorship credit.

As the public begins to witness the plagiaristic tendencies of AI algorithms, the labor union representing writers for film, television, radio, and other media industries is pondering how to manage this new frontier in content creation. The WGA seems willing to consider AI a legitimate tool in the scriptwriting process, but it doesn't want its members to lose any money because of it.

According to three unnamed sources from within the movie industry, the WGA's proposal does not call for an outright ban of AI technology from writers' work. Hollywood script and story writers would rather adopt generative AI by treating it as just a "tool," with no consequences for credit or monetary compensation.

The WGA is discussing the state of generative AI in its talks with the Alliance of Motion Picture and Television Producers (AMPTP), as the two organizations work on drafting a new working contract. The WGA later confirmed its proposal in a series of tweets about the regulation of "material produced using artificial intelligence."

According to the aforementioned tweets, such regulation should ensure that movie and TV companies can’t use AI to “undermine writers’ working standards” when it comes to compensation, residuals, separated rights and credits.

The WGA says AI can't be used as "source" or "literary material" for any MBA-covered project, as these are two fundamental definitions for classifying writers' work. Source material refers to original novels, plays, or even magazine articles on which a screenplay can be based. Literary material is the written product of a writer's work, which then forms the basis for residuals and other compensation.

AI cannot be used as source material, the WGA says, as AI software isn’t capable of creating anything on its own. ChatGPT and other machine learning algorithms are just statistical inference machines that generate “a regurgitation of what it’s fed,” the organization states.

AI is being fed both copyright-protected and public-domain content, and it has no "intelligence" or awareness whatsoever to distinguish between the two. Therefore, AI output cannot be eligible for copyright protection, nor can AI software sign a "certificate of authorship." Rather, the WGA concludes, plagiarism is a built-in feature of the AI process.
