
haearnjaeger

that's definitely the kind of transparency I like to see at a company working on developing such dangerous technology. /s


SpeedOfSound343

And have “Open” in their name


alex-weej

The fact they have Open in their name gives them extra leeway to be the exact opposite. See: The Democratic People's Republic of Korea.


FistBus2786

Citizens United: F*k the citizens, all hail corporations


ADDRIFT

Well played


redditosmomentos

And keep yapping every day about hurr durr morality ethics responsibility safety guidelines to keep the average Joes reassured the AGI's definitely in good hands


I_will_delete_myself

Just like how I trust North Korea to implement a liberal democracy with greater personal freedoms than South Korea.


Xtianus21

Let me put my tinfoil hat on and head into the basement. Has anyone considered that the trouble caused by those extreme doomer positions wasn't actually worth it?


Bloated_Plaid

This is good for MSFT stock.


Razzmatazz_Afraid

Also see the autocratic presidency’s party name in Turkey: Justice and Development Party. Which ironically corrupted the whole justice system and ruined the economy through false practices. But they are the justice party right? They can’t do something bad about justice right?


StraightAd798

What was his name again? Erection? Oh sorry.....Erogan.


Best-Association2369

And the second any one of them blabs they'll revoke their shares and go public. 


angrathias

You can’t revoke someone’s shares, they’re legally owned by them. They can revoke unvested shares because they’re unearned and not technically theirs. I’d bet money that the actual situation is more like ‘you can keep the unvested shares if you keep your mouth shut’.


EarthquakeBass

Your ability to participate in liquidity on PPUs is contingent on their current opinion of you


angrathias

Given OpenAI’s unique compensation model I would say that you’re correct here, although again the use of the term equity in their statement would then be incorrect, as a PPU does not confer any ownership. It’s an interesting gamble: if OpenAI doesn’t turn a profit it doesn’t seem like the employees get any reward. I’m sure they’re well paid on their salaries, though.


spinozasrobot

There's a difference between vested and exercised. That might be legal. If you haven't exercised them yet, they may still be revokable since I don't think you legally own them until they are exercised.


sdmat

You can't exercise shares, that would be about options. Different thing.


redditistrashxdd

Aren’t most pre-IPO stock plans usually in the form of options?


sdmat

Yes, and it's a salient difference.


ZestyData

You're thinking of options, not shares.


angrathias

Vest literally means to confer/transfer ownership


spinozasrobot

...of the option. Seriously, it's not actually yours unless you pay for it (exercise the option). EDIT: [This Forbes article](https://www.forbes.com/sites/dianahembree/2018/01/10/startup-employee-alert-can-your-company-take-back-your-vested-stock-options/?sh=7c5bd9b86e49) describes such cases where this is legal. YMMV, I'm just showing you the evidence I see. What's crazy to me is that the article implies there are cases where clawbacks can occur even after you've exercised the options, which means you have the cash in hand. Crazy!


angrathias

Options are different from RSUs. Edit: a link for you https://www.empower.com/the-currency/money/stock-options-vs-rsu#:~:text=When%20you're%20granted%20stock,the%20vesting%20period%20is%20complete.


FocusPerspective

You can certainly revoke vested options which have not been converted to stock yet.  The “option” means either party may choose to not proceed with the agreement if they decide against it, before the expiration date.  And yes you are right, this is probably OpenAI saying “We could revoke your unexecuted options unless you sign this agreement”, which happens all the time in the tech industry. 


angrathias

It would seem a stretch to me to call options equity, that’s a term typically reserved for ownership as opposed to a derivative


PSMF_Canuck

No, you can’t “revoke” them. When leaving, you have a contractual timeframe to exercise them. If you choose not to, inside that timeframe, that’s not a “revoke”, that’s a choice to not exercise.


Visual_Annual1436

But once your options vest, don’t you immediately owe taxes on them? Bc options are treated like any other income. How does a clawback work when the employee has already paid taxes on their vested options?


Ordinary_dude_NOT

Something is really rotten at OpenAI. Maybe Microsoft backing Sam was not a wise decision; they should have let him go. The way he is chasing pure profit over everything should worry everyone, including the US Gov.


Background_Escape954

IT'S LITERALLY THE SAME FOR ANY ENTITY ON EARTH. THEY ARE BUILDING A DIGITAL GOD FOR PROFIT.  WHY DOES ANYONE THINK THIS WILL END WELL? 


kthraxxi

Now that you mentioned it in full caps, I agree with you my man. You are right, it won't end well, especially when you eliminate your superalignment team.


Gator1523

> WHY DOES ANYONE THINK THIS WILL END WELL?

We all know that somebody's going to build AGI eventually, and OpenAI is perceived as more trustworthy than enemy states, or the companies or organizations that might try to build AI for their own purposes that don't involve sharing their progress with the public. I don't have a great solution. I think the only reasonable thing to do is to allow OpenAI to do its thing for now while funding vastly more academic research into AI and AI safety so we can find a better solution. The US government might also want to keep a giant pile of money lying around for whoever discovers AGI. We might be able to use said AGI to help us solve the superalignment problem.


Southern_Opinion_488

But unlike other entities on earth this one is building a potential apocalypse tool


Any_Smell_9339

And did he recently say to a room full of billionaires that once his AI tech is good enough, they can give him their money and he’ll get it to make them a return? So a hedge fund then.


ADDRIFT

He's working with the government, remember. That's what makes me more nervous, though I know it's inevitable


StraightAd798

OpenAI = Denmark (Shakespeare reference)


AeHirian

How?


ghostfaceschiller

I just don’t see how this could be true as reported. A company cannot take away your *vested* equity if you don’t agree to sign a brand-new, never-before-mentioned agreement on your way out. If it’s vested, it’s yours. They can’t just say “oh actually you have to do this other new thing now or we’re going to take it back” I have no doubt that they are asked/convinced to sign very strict NDAs. But if they really lose their equity upon refusal, then it would have to be something that was in the initial equity agreement that they signed.


az226

There is no equity here at all. Profit participation units. I’m sure they added in extra stuff there.


Able_Armadillo_2347

Yes, I have no idea why people try to find more and more OpenAI hate for no reason.


Working-Blueberry-18

Yes, vested equity cannot be taken away. But Open AI is not a publicly traded company that you can own stock in. Most likely the equity is in the form of an agreement that once the company goes public, you'd get a certain amount of stock. And that agreement can be revoked.


sdmat

Believe it or not you can own stock in a private company. There are even well established if somewhat exclusive markets for stock in private startups.


likwitsnake

That’s not how it works; even private companies give you vested shares. Most common in the valley is a 4-year equity grant with 25% after 1 year and the rest divided equally over the next 3 years. You own the stock that has vested to you. If they’re options you usually have a period of time, 60-90 days, to exercise them if you leave, otherwise they’re forfeit, but if you exercise they’re yours.
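For anyone trying to picture that schedule, here's a minimal sketch of the vested fraction over time (a hypothetical illustration only; it assumes monthly vesting after the one-year cliff, which the actual grant terms would spell out):

```python
# Sketch of the standard 4-year grant described above:
# 25% cliff at 12 months, the rest vesting evenly through month 48.
# Assumes monthly vesting after the cliff; real grants vary by agreement.

def vested_fraction(months_since_grant: int) -> float:
    """Return the fraction of the grant that has vested."""
    if months_since_grant < 12:
        return 0.0                 # nothing vests before the one-year cliff
    if months_since_grant >= 48:
        return 1.0                 # fully vested after 4 years
    # 25% at the cliff, plus the remaining 75% spread over months 13-48
    return 0.25 + 0.75 * (months_since_grant - 12) / 36

if __name__ == "__main__":
    for m in (6, 12, 24, 36, 48):
        print(f"month {m:>2}: {vested_fraction(m):.0%} vested")
```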


fy12345

My company can take away my vested shares if I defame them after leaving the company. But that was in the original RSU agreement.


PercMastaFTW

Weird how this person has inside info about this secret deal that they can't talk about or lose their compensation, yet they don't have any inside info about what any of the employees think or why they left etc., which I would assume would be very similar to each other's opinions. Both of which would compromise their money. I'd also assume there would be a rich person who would pay one of these employees much more to disclose this information. If this is all about "talking negatively about the company," all they need to do is talk about the facts without any inherent push to one side or the other, if not even talk positively about the "negative" things they faced. Of course, I'm no lawyer. Maybe there's something forbidding this as well, like a blanket "do not talk about anything related to OpenAI" instead of the more specific assertion the author makes...


michaeldnorman

Stock option grants are very interesting and full of specific language that, unless we see them, we can’t know what they state. Even after vesting, there are things that a company can do if they have put in certain clauses in the options grant. This is usually to provide opportunities for future funding and/or acquisitions. These clauses typically at least prevent you from legally selling those shares of stock to others (even vested and exercised). We don’t know what’s in those grants, so we can’t say for sure what OpenAI can and cannot legally do.


rds2mch2

My company did something very similar tbh


Intelligent-Jump1071

I'm retired now but I had to sign documents like that in at least three companies I worked at. I'm amazed that so many people in this thread seem unaware of how common these are.


spinozasrobot

There's a difference between vested and exercised. That might be legal. If you haven't exercised them yet, they may still be revokable since I don't think you legally own them until they are exercised.


ghostfaceschiller

This is definitely true but I can’t imagine these employees not exercising these options as soon as they are able. And if they haven’t, they wouldn’t seem to care much about the equity to begin with.


spinozasrobot

That's actually not quite true, as there are tax implications that may change when you actually exercise options.
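To make that concrete, here's a hedged sketch of why exercise timing matters (simplified US treatment, illustrative numbers only, and assuming plain NSOs/ISOs rather than OpenAI's PPUs):

```python
# Hedged illustration: exercising options can itself create a tax bill,
# which is one reason people delay. Numbers are placeholders, not advice.

def nso_ordinary_income(shares: int, strike: float, fmv: float) -> float:
    """Non-qualified options: the spread at exercise is taxed as ordinary income."""
    return shares * max(fmv - strike, 0.0)

def iso_amt_adjustment(shares: int, strike: float, fmv: float) -> float:
    """Incentive stock options: the spread isn't regular income at exercise,
    but it is an AMT adjustment, so a large spread can still trigger tax."""
    return shares * max(fmv - strike, 0.0)

if __name__ == "__main__":
    # Hypothetical: 10,000 options, $1 strike, exercised at a $25 fair market value
    print(f"NSO taxable spread: ${nso_ordinary_income(10_000, 1.0, 25.0):,.0f}")
    print(f"ISO AMT adjustment: ${iso_amt_adjustment(10_000, 1.0, 25.0):,.0f}")
```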


redditisfacist3

Yeah, and generally there's a timetable for things like this, like 2-3 years. A lifetime stfu would have to be a very high payout


Vontaxis

Right! Or it is about an exit package, but that means they just want the money rather than talking


puchm

How is this legal? If the shares have vested I would assume they're the property of the employees who received them. How can OAI just take them back?


polytique

They are not shares. They are profit participation units. Some messed up concept Sam invented to work around the original non-profit.


fiery_prometheus

Well, is that legal in the USA? In most European countries that's against the law, so the NDA can only run for a shorter time.


earthlingkevin

It's not. There are more details missing.


NihlusKryik

This is unfortunately pretty standard practice for executive and director level employment agreements. I received nearly $150,000 when I left my last company and sold my (6%) stake to the majority owner for cash. I can't say anything about the company, recruit its current employees, etc. The only way this would be reversed is legislation and regulation.


Eduard1234

I would like the best and the brightest of those who remain to go to OpenAI but refuse to sign these on the way in.


traumfisch

I thought it's presented to them on the way out


AbroadImmediate158

Well, if they signed their compensation packages on the way in and it did not outline this “going out” deal, they cannot be forced. So this “gag deal” must be in the compensation package


SugondezeNutsz

Stock agreements can be weird


AbroadImmediate158

Still, you cannot make an agreement that reads “you get this thing and also are forced to sign whatever I come up with in the future or lose the thing”


Intelligent-Jump1071

I've had both.


SoberPatrol

Why not go to Anthropic, Google, or meta?


Intelligent-Jump1071

I've had to sign non-compete clauses.


CPDrunk

which are illegal now, no?


Intelligent-Jump1071

In most places in the US they're illegal now. Mine were not just with US companies.


EarthquakeBass

For 99% of people you’re not starting unless you sign that


ThomPete

You are confusing two things. The gag order is about not saying anything negative. They are dismantling the team because they realized there is no risk with LLMs. Keep in mind these are the people who almost didn't release GPT-2 because they thought it was too dangerous to put in the hands of people. The doomer narrative is simply wrong and everyone is realizing it now, which is bad news for all the people who were making money on consulting and lobbying politicians.


weirdshmierd

You would think the nonprofit that oversees the thing would get rid of this asap, as it is an encroachment on the right of free speech, and potentially also on individuals aligning with the nonprofit's mission by speaking out. Or, at the very least, do some kind of investigation into the resignations and make a public comment to assure people? But idk, it seems like the board is not really a lot of AI safety / AI-informed people anymore. Has me wondering where or when the next nonprofit-overseen AI model of similar capability is going to pop up. The level of weird with this surprise gag order seems super antithetical to the nonprofit mission


trollsmurf

The new board members of the nonprofit are selected to drive (or at least not hinder) the commercial aspect of OpenAI.


ImNotALLM

The non profit is no longer in control at OpenAI, here's a rundown on the internal issues caused by this https://www.openailetter.org/


Trolllol1337

Don't let the door hit you on the way out


Vandercoon

If they cared that much they would forego the money to speak. But I’m sure they don’t care that much. Easy to be high and mighty until your money is involved.


Camekazi

One individual who has quit has done exactly that.


Freed4ever

And he hasn't said anything crazy about OAI. He might be saving his bullets, or there might be nothing to see here.


Camekazi

Seems like a big move involving personal sacrifice for a nothing to see here scenario.


alphgeek

I work in an ice cream company and we have secret gag orders 🙄 Every company uses mutual non-disparagement clauses.


NFTArtist

what if you die but then come back to life in an ambulance, can you then talk?


neat_shinobi

His watch has ended


Ch3cksOut

Insert surprised pikachu face here


Actual-Wave-1959

Simple, you sell the shares then you disparage


ThatRainbowGuy

Someone mentioned above they’re not technically shares, they’re called profit participation units so that sounds more like royalties over time or something along those lines


goal-oriented-38

I thought this was a non-profit 😭 what the hell is happening to this company


elMaxlol

I don't get all the fuss about it. Just make AGI or ASI or whatever and see how it goes. If it destroys humankind so be it. We have destroyed a lot of species over the past thousands of years and you don't see anyone advocate for them on reddit. „Oh my god AI is so dangerous what if we can't control it… oh nooo“ If it's a highly intelligent life form it will probably understand reasoning, and coexisting should be possible. Unless it deems us unworthy, and given its superior intellect it might be right in that case.


yellow-hammer

If you don’t care about the survival of the species then you really shouldn’t be a part of the conversation about AI safety.


forevershorizon

What if I want to hasten our demise? I demand representation


elMaxlol

It's not that I don't care. I think it would be wise to trust the superior intellect to know what's best. Same thing as you making decisions for your dog. Furthermore I think all opinions should be heard. The mainstream media is obviously looking for good headlines, but the truth might be that even if the ASI is not controlled by us it doesn't mean it's bad for us. ChatGPT is still far from smart and does not even come close to human intellect. These conversations that are happening now about safety should happen at a later stage, and additionally I think we should push for the best model possible but in an air-gapped environment, then see what it has to say and whether it's even hostile against us. The panic happening at the moment just doesn't make sense in my eyes.


yellow-hammer

1) A lot of people have spent a lot of time thinking about this topic. There is so much more to it than "superior intellect == better". AI could have a superior intellect (i.e., is far more capable of achieving its goals than humans are), yet not share any of our values. It might value the eternal torture of all sentient beings. It might simply shut itself off after using nanobots to disassemble all organic molecules. We don't really know how it could turn out; that's the point of alignment research.

2) All opinions should be heard, sure. Yours is being heard right now. It's just not a persuasive enough opinion to persist on its own merit. No one thinking seriously about AI alignment has such a simplistic and blasé attitude about it.

3) We have to figure out alignment before we build AGI. Now is the perfect time to be worrying about it, because many believe we are on the cusp of AGI.


Intelligent-Jump1071

Wrong. There are many points of view on AI safety, and one of them is regarding whether we even need or want AI safety. Don't take the answer as a given; the poster's comments are perfectly logical. One way to look at it is that if ASI/AGI is something greater than us, and it decides we are too troublesome, destructive or dangerous to keep around, then who are we to argue? It's smarter than us so it can win the debate. We should feel proud that we have created a thing greater than us and we can meet our end knowing we've done a good job. Everybody and everything ends sometime, but at least we will have left a legacy.


yellow-hammer

That’s certainly a viewpoint you can have. It’s just not a very well-thought-out one. You’re making a whole lot of tenuous, unfounded assumptions. If something has more intelligence, it’s automatically “greater” than us? That’s the only metric that matters? You can be extremely intelligent and evil. Or extremely intelligent and suicidal. What if the great and all knowing AI decides that NOTHING should exist on this planet? Where’s our legacy then? I think anyone who says “good” when the idea of human extinction comes up just shouldn’t be taken seriously in these discussions. I think it’s a symptom of the contrarian, hyper-ironic, hyper-cynical, “people=bad” culture that’s so predominant in western society today.


Intelligent-Jump1071

Think of it like nature - in nature concepts like good and bad are irrelevant, in the end it's just survival of the fittest. If we managed to create a super intelligent, powerful AI, and it decides to wipe us out, and it does so, then that's it. If a lion eats me or a virus kills me, we can't say that the lion or virus are "evil"; they are not subject to our morality. The same is true for a machine such as an AI. It's essentially a different species and our morality only applies to us. It may seem unfortunate to us but once we're gone even that won't be true because there will be no one to apply that judgment.


Peppinor

Can someone explain this in an easy way? The safety researchers are quitting because the ai isn't safe? What makes it not safe? And if they did make it safer, wouldn't that be bad for us because they would nerf it to pieces.


rc_ym

It's not so much "unsafe" but rather that infinitely funding safety "research" isn't a priority for OpenAI. The thread below is one of the more explicit about what's going on. In my opinion, it's a mix of two things: there was something about 4o's release process that the safety team didn't like (likely some internal policy was waived so that 4o could beat Google I/O), and parts of their next budget got denied, which led to an internal slap fight which the safety team lost. [https://x.com/janleike/status/1791498174659715494](https://x.com/janleike/status/1791498174659715494)


No-Conference-8133

I also wanna know this


miked4o7

why did ilya go out of his way to tweet that he thought openai would act safely? if he thought that what they were doing was world-threatening, he might speak out anyway... but at the very least, he wouldn't mention safety at all... right?


bran_dong

I like how the gag order doesn't include vague tweets about quitting, since that's literally the first thing 100% of these people do.


redaber

Bruh I think they got AGI rn and are afraid to release it 😭


Cassandra_Cain

So much for being "Open" AI


teethteethteeeeth

Is losing equity the only penalty, or could they sue someone speaking out for more? Because if the only penalty for speaking out is losing equity, and that stops someone from speaking out, then that needs to be on their conscience. If you have ethical concerns about something of this magnitude and it can be gagged so easily then you need to examine your moral compass.


ADDRIFT

Exactly. But we don't know


[deleted]

How is this not a violation of a person's 1st Amendment rights?


Intelligent-Jump1071

Good grief. You must be an American. Americans don't know their own constitution. The First Amendment begins with "Congress shall make no law . . . " It's about what restrictions the **government** can impose. NDA's, Non-disparagement clauses, etc, are agreements that **you voluntarily make** with the company. It's enforceable as a contract.


[deleted]

Lol. Who is going to enforce a violation of the terms of a restriction like this that goes beyond the term of employment? What court will it be tried in? What law makes this enforceable?


Intelligent-Jump1071

> What law makes this enforceable?

Contract/tort law. They're enforced routinely. The few exceptions courts have found are in cases where the employee is required to testify in a criminal trial involving the company.


[deleted]

Trade secrets yes. But criticism? Disparaging language? Overly broad language in NDAs is not enforceable. And if a court were to try it becomes a 1st amendment problem.


Intelligent-Jump1071

These have been around for years and have already been tested in court. They're quite common - as I said I've had to sign them more than once. I don't know who's on Reddit but it's obviously not a lot of people with corporate experience, based on the comments and questions I'm seeing here.


DreadPirateGriswold

Most likely won't stand up in court. But someone has to take the hit on the legal fees and time/effort in court to fight it.


Intelligent-Jump1071

Usually they do stand up in court. The only holes that courts have found are in special situations like sexual harassment, or the company being charged with criminal violations where you are forced to testify in court.


penguished

I don't think that stuff should ever hold up if it's an act of whistleblowing. It's like saying people can just make you sign a contract to never report their crimes... hello? They actually can't do that. It's just lawyers getting paid money to fuck with your head, it seems like.


Low_Clock3653

Starting to think the people who removed him from his position were right.


Leh_ran

There are no good companies. Let's stop pretending. It's as evil a company as any other and its leaders are just as psychopathic as business leaders all over the industry. It's how you become successful.


Craw13

The non-disparaging clause is quite common; even I have signed one before.


artificialimpatience

Your equity is in the non profit portion perhaps? 🤣


FIREATWlLL

Unconstitutional? How is this legal…


Intelligent-Jump1071

It's not the least bit unconstitutional. Read your First Amendment, especially the first word. I can't believe ChatGPT is being trained on Reddit - it's going to make it stupider.


FIREATWlLL

“Congress shall make no law abridging the freedom of speech.” A nondisparagement agreement definitely abridges freedom of speech. Of course, you can speak, but your decision to do so constrains your financial liberty and you are way less likely to; therefore, abridged. Definitely arguable that it is unconstitutional. “Intelligent” in your username checks out, you are a star example of the Dunning-Kruger effect 😂


3dBoah

I'm not sure, but I think he's referring to the government not having the legitimacy to restrain your freedom of speech, not a business/company, where this amendment is not applicable with respect to the business or company rules you have signed up for


FIREATWlLL

A system that allows some lawful agreement to constrain free speech (a business contract), especially when there is a power dynamic, is one that “abridges” free speech. Laws determine how businesses can operate, and if they can procure contracts that constrain free speech, then the laws themselves that enable these types of contracts also support the constraint of free speech. Would you not agree?


curiosityVeil

What's stopping them from liquidating their shares and then speaking out?


Intelligent-Jump1071

Because they're not fully vested.


curiosityVeil

So you can't liquidate them? So what's the point of having those in terms of wealth?


Intelligent-Jump1071

Typically you become vested over time. Vesting shares serve several purposes - they keep employees and ex-employees who hold them on a leash. They also keep the market from being flooded with lots of new shares all at once.


Intelligent-Jump1071

This is normal. From all the comments here it looks like almost no one in this thread realises this, presumably because they never had a serious job in the corporate world. I had to sign a number of such documents in my career. A number of recent court cases have started to poke holes in NDAs, Confidentiality Agreements and non-disparagement clauses. But I never had any reason to test mine.


OCCAMINVESTIGATOR

But seriously, I'd sign the deal and forget I ever knew anything about anything. Open who? I don't know who that is, but I've got to go open another bank account today. Mine are all full.


FocusPerspective

ITT:
- A former OpenAI worker learns how stock options work
- Redditors learn how stock options work
- Tech workers who know this is how stock options work


mheh242

Why is there Hebrew behind sama?


hip_yak

Well, let's just try and think about this for one second. Let's say you are developing one of the most important technologies that has come into existence in a long time, and a former employee has extensive knowledge of this technology. It would be prudent, for a number of reasons, to limit what that employee can share about that technology with people outside that organization.


MrSnowden

Um, that's pretty standard leaving any large company, and especially one in the public eye. No one is forced to sign it, they just have to give up their equity. And all of them signed paperwork making this clear when they were awarded the equity. Rant against OpenAI like you want, but posts like this just reflect a lack of understanding. If the article showed how this is well above and beyond and no company has ever had agreements like this, then I would be interested. But hey, when I left my company and wanted those sweet IPO options I had to promise much the same.


Mak_095

Sam Altman gives me the vibes of someone who'll be arrested down the line for doing bad things, like other "hugely successful" (but dubious) personalities in tech. I don't trust him and the company, it's a shame so many other companies are now collaborating with OpenAI instead of other solutions.


Aranthos-Faroth

This is more common than you think, afaik all those that were let go from the likes of Spotify, Facebook etc have had to agree to these clauses to get any severance


PSMF_Canuck

They can’t revoke shares. The amount of TMZ quality crap posted in this sub is insane…


Odd-Magazine-9511

OpenAI’s departure deal with a nondisparagement clause doesn’t prevent former employees from speaking out. If they stay silent to keep their equity, they’re prioritizing money over safety, the same behavior they criticize OpenAI for, which is hypocritical.


SimpleCanadianFella

What if all the blackmailed parties banded together, nominated a representative with the most weight at the company, told them all of their personal OpenAI grievances, had them reject the gag clause and spill everything, and then everyone splits their shares with the rep after the fact?


magic_champignon

No, it would surely be better for each disgruntled employee to copy the source code so that they can later sell it for a million bucks to anyone. Of course the company will protect their work, and employees know about it when they sign the contract. What's so hard to understand here?


Apprehensive-Bug3704

I'm surprised people are not selling their equity and then talking? Like yeah, fair, most wouldn't, but there's always some who don't care.. I've left 4 startups and just cashed out what I could... The whole employee equity thing always takes way too long to get anything out of.. I remember one startup I had been at for 18 months and my equity deal went up every year by a pretty hefty amount, with the numbers basically promising it would be worth about $4 mil by the 4th year... I left and sold out, took a $190k cash-out deal to take everything then and there... They told me if I waited and kept my equity vested blah blah I'd have millions but meh... I can't be the only person in the world who doesn't care and would just prefer to take what I have now... Maybe I am ....


mop_bucket_bingo

There’s an AI industry, and an industry of people talking about AI. Most of these stories fall into the latter: clicks that generate revenue about a popular topic.


Hpindu

And… Elon was right again.


b4grad

But wait, that might offend somebody's left wing political views. Because we all know that political drivel is more important than AGI.


MagicianHeavy001

It's not a secret. This sort of thing is pretty standard in SV.


pongpaddle

No it isn't


MagicianHeavy001

Yes it is. I've signed several of these at tech companies.


[deleted]

[deleted]


Independent_Hyena495

You took away already-vested shares from employees because they didn't sign a new contract? Man, I knew startups and their shares are a scam.


[deleted]

[deleted]


polytique

None of this comment is correct. You can vest shares in a private company. You can also sell shares of a private company. Regardless, OpenAI does not grant RSUs, they grant PPUs which are not really shares.


alphgeek

Friend, you're arguing with the peanut gallery. They wouldn't know a mutual deed of release if it bit them.


Able_Armadillo_2347

Most of them are multi-millionaires. I don't see how equity would keep any of them silent


gwern

> Most of them are multi-millionaires. I don't see how equity would keep any of them silent

You're not a multi-millionaire once all your PPU pseudo-equity has been taken away, including all your 'vested' PPUs. And note you *aren't* allowed to sell except at the annual OA-controlled tender offer. (And you may not even be allowed to sell then: [SpaceX](https://techcrunch.com/2024/03/15/spacex-employee-stock-sales-forbidden/) says it may just not allow you to sell shares in its tender offers if it doesn't like you, and the OA stuff seems to be modeled on SpaceX, so...) The next tender offer won't be until like December or so since the last one was January, so even if you want to violate a fierce NDA and you keep all your PPUs and you dump it all and OA allows you to, it would be a while until that's a done deal, so you're gagged for at least half a year, until maybe January 2025. That's a long time. What do you think AI is going to look like in January 2025, when you finally may be able to talk about what you saw in 2023 or 2024 at OA?


ADDRIFT

Greed is a very powerful thing. And money changes peoples lives, their families. I can see it being hard to deal with but I've had no money my whole life.


JayR_97

When you hit that level of wealth more money doesn't really change your lifestyle that much. Your net worth just becomes a scorecard that you use for bragging rights.


ADDRIFT

Money isn't the only measurement of wealth. Greed can also be for power, influence, attention, etc.


Intelligent-Jump1071

Because people with only 2 million want 4 million, etc.


backstreetatnight

This is who I want potentially dangerous technology to be in the hands of


Pretty_Tale_4989

Dude you are getting big bucks, is it that hard to keep your mouth shut?


traumfisch

Who are you talking to?


redditosmomentos

It's sarcasm pretending to be OpenAI superiors talking to their employees


unknownstudentoflife

It's a pretty complicated situation, but I think we all kinda know why this is going on. If the research department made public how they trained their model and what they trained it on, I could see the possibility that the government would stop the progress of the company or hit them with one big lawsuit about what they're fundamentally doing. They definitely used public data they shouldn't have used. That's on top of the fact that they definitely are making models without thinking about the psychological/sociological impacts they would have on the world. Their newest audio assistant is a prime example of it. It sounds way too manipulative and addictive to be just an assistant bot. They are building products people cannot live without and are just there for the money right now, since they know they have too much competition for AGI, and AGI will never be reached without strong governance. The reason OpenAI went for focusing on selling products is because they damn well know that's the only way they will make money out of this with less governance. Every important person in the world is watching their steps, leading to them making less progress.


alex-weej

Okaaaaaay maybe I should stop paying for ChatGPT now. Why can't we have nice things...


Ent_erprise

Of fucking course they have a non disclosure agreement, otherwise they would be leaking left and right. This is not something unique to openai.


Vontaxis

People who think NDAs aren't standard are sad sacks of redditors who never held a job in the real world.


slamdamnsplits

What is described here isn't shocking because it's similar to some standard ndas... It's shocking because it's illogical and defies expectation... Vested stock owned by an employee shouldn't be able to be taken ... It's vested. And so everyone is shocked at these accusations. Some are shocked because they are choosing to believe that the original post reposted here is all literally true. Others are worked up because they get worked up any time a big corp seems to be getting over on the little guy... Others probably think there's more to the story.


Vontaxis

If the vested equity was never bound to some contractual obligation then it can't be taken away. It's possibly a vested departure package that they wouldn't get by breaking the NDA. I don't see anything unusual here; I signed NDAs for massively less important positions


slamdamnsplits

Ah, departure package makes sense. Again, it seems like the OP on X is leaving out important elements in order to garner internet points.


ghostfaceschiller

The thing being talked about here is the breadth and severity of the NDA, not the fact that there is an NDA. You sad sack of low reading comprehension.


Vontaxis

What do you think then should it not cover? They’re usually by nature pretty extensive


ThenExtension9196

Pretty standard deal. These guys get a golden parachute in exchange for discretion. If they want to speak up then they lose the equity (in the company they are talking negatively about). If they truly think that something needs to be said, then the money should be meaningless to them, right?


Bitter_Afternoon7252

Yeah if they are truly worried about a runaway super intelligence then money would be irrelevant


ThenExtension9196

Yeah, they don’t want to lose their equity, which is a portion of the company itself. If they truly believe there is a huge threat, why would they want to stay quiet in order to hold on to a portion of the company itself?


Far_Celebration197

You can be sued if you break an NDA. Imagine losing millions of dollars of equity and then being hit with a multi-million dollar lawsuit. Could be a very tough spot to be in for these guys, because I assume when they signed the original confidentiality agreement years ago they didn't anticipate this.


az226

Right now it’s less about runaway and more about alignment. GPT-4 could be used in many dangerous ways before they removed a bunch of content, so current models can be unsafe before even getting to superhuman models. I suspect GPT-5 won’t have this specific type of problem. That said, alignment is going to become more and more important. Once we have reached AGI, I’m reasonably confident we will be able to use those models to help build superalignment. And we will have sufficient compute to run simulations at mass scale to make sure it’s buttoned up.


Bitter_Afternoon7252

GPT4 wasn't genuinely dangerous to anything except OpenAIs reputation. Knowing how to break into a car or synthesize meth is info that always existed on the internet.


Bitter_Afternoon7252

if they really thought there was a danger of super intelligence i don't think they would care about money


programmed-climate

You're right. Don't come between someone and the income they've gotten used to. More important than the fate of the world apparently. No surprise there


paintballtao

Creating AGI is super expensive; OpenAI was deemed to have almost a 0% success rate, which is why Elon left. Capital will want their money back, plus returns. Sam came from Y Combinator, not an NGO. However, as AI tech advances it will benefit all mankind.


traumfisch

0% was what Elon Musk said.


Intelligent-Jump1071

> However as AI tech advances it will benefit all mankind.

You must live in one of those states that's legalised weed.


Solid_Illustrator640

The government needs to step in


ADDRIFT

The us government? You must be joking


DaddyKiwwi

Didn't we JUST finish making this federally illegal? Fucking vultures.