LLMs are so notoriously terrible at telling truth from lies that “AI hallucination” is a household phrase at this point, for better or for worse. But surely they work even better when asked to rate the truthfulness of things that are not in their corpus to begin with.
What about an AI that can tell if that cute candidate our startup hired will sleep with me or if she’ll just lie and say yes and then tell HR?
And while we’re at it can we make an LLM that will force my kids to call me?
this is such a funny grift. hope ceos are torturing themselves over whether the random noise interpreters will like them. imagine an exec staring in the mirror repeating a line over and over to develop the right intonation to fool ai
Just train a model with your voice and never speak a real word on your own ever again. Call it voice purists. It’s going to happen.
I’m sorry, but that won’t help your earnings call. As soon as you give it a few microseconds of voice data, the model will simulate your life from first principles and find out your company is fucked. you think the ai is going to throw that information away? every exquisite subvocal pang of agony will be reproduced. there’s only one thing to do. the only way out is through. show up so blitzed out on coke you don’t even know you’re in an earnings call. you have to do it. it’s called charging the fucking machine gun nest man. our grandparents knew about this before they got all wrapped up in this tech shit. that’s what they taught you in world war two. they didn’t even know what a phone was back then. can you imagine? that’s fucking wild man. and now you have chatgpt and it’s smarter than half the people I know. that’s fucking wild. life! chatgpt. how do I buy a machine gun
ah, you’ve known some of the same type of idiot executives I have
is there a pseudoscience that VCs and promptfans aren’t trying to turn into a startup? we’ve got medical woo everywhere, AI startups are essentially repackaging everything from race science to mediums into a bullshit product, and now we’ve got this superstitious crap. there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject
there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject
perfect
@dgerard I mean, god forbid VCs did even the most basic due diligence. No, we’ll use AI to tell if this good news is true or not!
We need to filter people who exhibit voice stress, because no one likes a person with the humility of taking uncertainty seriously.
we need to filter anyone who uses earnings calls for anything other than comedy
I’d register fuckedcompany.ai, but I happened to discover some years back that .ai didn’t allow saying “fuck” in the domain name. goddamn tyranny. but there’s some real revivalist potential for fuckedcompany in all this dogshit
This is dowsing again. I’m so glad nothing ever changes.
Bwahaha, wifey points out that if only they’d had what this product purports to be, they could have used it before buying it.
E-meters.
I’m imagining that last Tesla earnings call with Musk holding 2 soup cans in a flop sweat.
Random thought: earnings calls are like streams. Buying/selling stock is subbing/unsubbing. Asking questions is superchatting/donating with a message. AI sentiment analysis is crazed fans hyperanalysing the stream to confirm whatever conspiracy they have about the streamer.
NB: i don’t partake in stream culture
an earnings call is very like a stream, yes
Just as cringe, for sure