‘The public has a profound interest in having that technology controlled by a public charity legally bound to prioritize safety,’ the brief says.
Encode, an artificial intelligence (AI) advocacy group, has filed a brief in support of Elon Musk’s recent lawsuit against OpenAI, arguing that allowing the company’s conversion to a for-profit entity would endanger the public interest.
Musk filed the lawsuit against OpenAI last month, arguing that the organization was founded on promises that it would retain its nonprofit status and remain focused on the safe use of AI. Musk said he invested in the project on that premise. Disrupting the status quo “will seriously harm plaintiffs and the public at large,” the complaint said.
In the brief, Encode is described as “a youth-led organization advocating for safe and responsible artificial intelligence (AI)” with “a network of over 1,000 volunteers across 40 countries.”
On Dec. 27, Encode filed a proposed amicus curiae brief with the U.S. District Court for the Northern District of California, Oakland Division, supporting Musk’s motion for a preliminary injunction against the transition.
“If the world truly is at the cusp of a new age of artificial general intelligence (AGI), then the public has a profound interest in having that technology controlled by a public charity legally bound to prioritize safety and the public benefit rather than an organization focused on generating financial returns for a few privileged investors,” the brief said.
OpenAI CEO Sam Altman has admitted that AI poses severe risks to humanity, Encode said. Altman signed a statement along with numerous luminaries, including Nobel Prize winners, saying that “mitigating the risk of extinction from AI should be a global priority.”
People worldwide are already facing challenges from AI technologies, including disinformation, algorithmic bias, labor displacement, and democratic erosion, which makes keeping AI safe a “pressing, immediate concern,” the advocacy group said.
OpenAI currently operates a capped-profit subsidiary that is fully controlled by its nonprofit parent, which is expected to ensure the safe use of AGI.
In Delaware, where OpenAI is incorporated, the boards of nonprofit charitable corporations owe fiduciary duties toward their beneficiaries, which in this case would be “humanity,” Encode said.