it’s simply that Back has nothing to gain from claiming to be Satoshi. It would make bitcoin a lot more volatile. He even said just now:
> I also don't know who satoshi is, and i think it is good for bitcoin that this is the case, as it helps bitcoin be viewed a new asset class, the mathematically scarce digital commodity.
if he has the key he could buy security (move the money to a multisig, etc)
if he doesn't have the key, then of course he's motivated to make sure bad people don't think otherwise (but the usual thing about bad people is that they don't have a habit of giving up without a fight, so if someone really believes he has the keys, he can't do much to convince them otherwise, and he's back to hiring security again)
I didn't realize anything. Everyone and their mother has been saying that about Satoshi since day one... Also, Adam would have infinitely more to gain from being Satoshi than from Satoshi remaining unknown. He's been trying to take credit for Bitcoin ever since he realized that Bitcoin was actually worth something (he initially dismissed it). All his actions point to someone who's largely motivated by financial gain, whereas Satoshi hasn't touched a single one of his 1M+ BTC.
Maybe :) But seriously, Satoshi is most likely a relatively unknown dude. Why would you spend so much effort hiding your identity and then become a very public figure of the Bitcoin community?
I founded a company with Adam. He is not Satoshi. Neither is Szabo. Best not to speculate publicly as no good comes from it - not least of which there are organized crime rings targeting potential suspects for violent kidnapping or home invasion.
you literally detailed why you think it’s not him in another comment. You can’t even get your story straight. Give actual evidence or keep your mouth shut, for everyone’s sake, since speculation doesn’t add anything to the discussion.
As for the quotes, it could be that he read them, liked them, and kept repeating them. However, given other matching circumstances such as grammar, that becomes unlikely. Also, this is just a single journalist; to know for sure, the analysis should be outsourced to a company that does forensics.
I understood minimal actions intuitively, it just made sense? The stamina meter was shrinking with each step, so I recognized it was something to look out for.
It needs to pass the most basic test of learning, which it currently can’t do. Probably won’t ever do, after listening to Dario on his latest podcast run.
Where we are at today is ASI (artificial semi-intelligence). Maybe in 20 years artificial super intelligence can be achieved, but certainly not AGI.
Yeah it is like putting a huge red button in the middle of an otherwise empty room with a small "don't press the button" sign on the wall behind you when you enter the room.
Truthfully I find sonnet-4.5 better at Rust code than Codex (medium/high). Haven't tried anything else (like react/typescript) since I only use AI for issues/problems I don't understand.
Because that's the definition that is leading to all these investments, the promise that very soon they will reach it. If Altman said plainly that LLMs will never reach that stage, there would be a lot less investment into the industry.
Hard disagree. You don’t need AGI to transform countless workflows within companies; current LLMs can do it. A lot of the current investment is to help meet the demand for current-generation LLMs (and the use cases we know will keep opening up with incremental improvements). Are you aware of how intensely all the main companies that host leading models (Azure, AWS, etc.) are throttling usage due to not enough data center capacity? (Eg. At my company we have 100x more demand than we can get capacity for, and we’re barely getting started. We have a roadmap with 1000x+ the current demand and we’re a relatively small company.)
AGI would be more impactful of course, and some use cases aren’t possible until we have it, but that doesn’t diminish the value of current AI.
> Eg. At my company we have 100x more demand than we can get capacity for, and we’re barely getting started. We have a roadmap with 1000x+ the current demand and we’re a relatively small company.
OpenAI's revenue is $13bn, with 70% of that coming from people just spending $20/mo to talk to ChatGPT. Anthropic is projecting $9bn in revenue in 2025. For a nice cold splash of reality: fucking Arizona Iced Tea has $3bn in revenue (and that's actual revenue, not ARR).
You might have 100x more demand than you can get capacity for, but if that 100x still puts you at a number that in absolute terms is small, it's not very impressive. Similarly if you're already not profitable and achieving 100x growth requires 1,000x in spend, that's also not a recipe for success. In fact it's a recipe for going bankrupt in a hurry.
I have no idea if OpenAI’s valuation is reasonable. All I’m saying is I’m convinced the demand is there, even without AGI around the corner. You do not need AGI to transform countless industries.
And we are profitable on our AI efforts while adding massive value to our clients.
I know less about OpenAI’s economics, I know there are questions on whether their model is sustainable/for how long. I am guessing they are thinking about it and have a plan?
This is correct, it should burn the retinas of anyone thinking that OAI or Anthropic are in any way worth their multi-billion dollar valuations. I liked AK’s analysis of AI for coding here (it’s overly defensive, lacks style and functionality awareness, is a cargo cultist, and/or just does it wrong a lot) but autocomplete itself is super valuable, as is the ability to generate simple frontend code and let you solve the problem of making a user interface without needing a team of people with those in-house skills.
There are many more use cases that aren't fully realised yet. With regards to coding, LLMs have shortcomings. However, there's a lot of work that can be automated. Any work that requires interaction with a computer can eventually be automated to some extent. To what extent is something only time can tell.
Sure, but you don’t need AI to automate computer work. You can make a career out of formalizing the kinds of excel-jockeying that people do for reports or data entry.
This is a relatively reasonable take. Unfortunately, that's not what most AI investors or non-technical punters think. Since GPT-1 it's been all about unlocking 100%+ annual GDP growth via wholesale white-collar automation. I agree with AK that the actual effect on GDP will be more or less negligible, which will be an unmitigated disaster for us economically, given how much cash has already been incinerated.
We’re a regular old SaaS company that has figured out how to add massive value using AI. I am making no statements about valuations and bubbles. I’m actually guessing there is some bubble / overhype. That doesn’t mean it isn’t still incredibly valuable.