Hacker News | jonhohle's comments

This is neat. I haven’t looked into it, but I would think relative offsets could still be an issue; then again, it seems there must be some translation layer/MMU, since the codegen will be different sizes anyway. This would impact jump tables and internal branches, primarily.

I mostly work on stuff from the 90s. Disassemblers make a lot of assumptions about where code starts and ends, and occasionally a binary blob is not discoverable unless you have some prior knowledge (a pointer at a fixed location to an entry point).

I would think after a few passes you could refine the binary into areas that are definitely code.
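That refinement can be sketched as a fixpoint walk over discovered branch targets. This is a toy model: the three-opcode ISA, its encodings, and the `mark_code` helper are all made-up assumptions for illustration, not any real architecture or tool, but real disassemblers converge on "definitely code" regions the same way.

```python
# Toy recursive-descent code marker. Assumed toy ISA:
#   0x01        NOP  (1 byte, falls through)
#   0x02 rel8   JMP  (2 bytes, relative branch)
#   0x03        RET  (1 byte, flow ends)
def mark_code(blob: bytes, entries):
    code = set()          # offsets proven to be code
    work = list(entries)  # known entry points seed the walk
    while work:
        pc = work.pop()
        while 0 <= pc < len(blob) and pc not in code:
            op = blob[pc]
            if op == 0x01:            # NOP: mark and fall through
                code.add(pc); pc += 1
            elif op == 0x02:          # JMP: mark both bytes, queue target
                code.add(pc); code.add(pc + 1)
                rel = int.from_bytes(blob[pc+1:pc+2], "little", signed=True)
                work.append(pc + 2 + rel)
                break                 # unconditional: no fall-through
            elif op == 0x03:          # RET: mark and stop this path
                code.add(pc)
                break
            else:                     # unknown byte: not provably code
                break
    return code

blob = bytes([0x01, 0x02, 0x01, 0xFF, 0x01, 0x03])  # NOP, JMP +1 -> 4, data, NOP, RET
print(sorted(mark_code(blob, [0])))  # [0, 1, 2, 4, 5]; offset 3 stays unclassified
```

The 0xFF byte at offset 3 is never reached, so it stays unclassified: exactly the "definitely code" vs. unknown split the passes converge on.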


Missing The Terminator. Also applies to Wipeout, a game with some of my favorite logo and design work.


I like how the first link made the entire mobile browser go yellow, even the buttons. How did they do that?

Have you ever met a security engineer? I’ve never met one who was also a good engineer (not saying they don’t exist, I just haven’t met one). Do they find vulnerabilities? Sure. Could they write the tools they use to find vulnerabilities? Most probably not.

I was trying to use Orca Slicer (which itself is intractable) and it had a combo button whose menu was disconnected from the button. The menu would disappear as soon as the cursor left the button boundary, but because it was disconnected, there was no way to reach the menu without leaving the button boundary, traveling across a void, and then getting to the menu. I’m unsure what incantation finally allowed me to choose the right command, but forget how it looks: it was as if no one even tried to see whether it works.

Most fun is when the menu opens both on button hover and on button press, but if the menu is already open, clicking the button closes it instead, so the first 2-3 times you use it, you end up opening the panel and closing it immediately.

Not sure how stuff like this gets deployed in the first place. I guess we're just a few people left who test the things we develop before pushing them to the public; I'd rather believe that than that people just don't care anymore...


I feel like the modern web/app ecosystems have forced developers onto a Red Queen-style treadmill, so software never really matures. They often build up to 70% of the features they want, the codebase gets intractable because of all the crap they have to deal with, and they start over.

I love software like GIMP, Blender, Inkscape, etc., that matured over decades and kept their soul.


Potentially keyboard arrows?

> a flash of lightning following the boom

That’s not how lightning and thunder work.


^f lightning ;)

Honestly, feels more like a bit. I sometimes say I need to cross my i's and dot my t's to suss out who's still paying attention in a meeting...


I never moved to Homebrew, and never understood the appeal. It’s refreshing to see people coming back to MacPorts after the last decade.

If you ever try to install a package from GH or an indie developer, you only get brew install/cask instructions. It's game over.

Regarding the appeal: this probably exists in MacPorts (I do not know, since you guys reminded me it still existed), but a Brewfile lets me provision a new Mac very efficiently.


As someone with a demand charge, let me disagree. The worst single hour out of the 60-70 billable hours in the month is used, and in my market it’s about $19/kW of demand. Turn on the AC once during that period: $80. Happen to have an oven or microwave going at the same time and you’re probably over $100. For one hour on one day of the month. Once you’ve screwed your month, you’re free to do it for the rest of the month, but it only takes once.
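The arithmetic behind those figures can be sketched as follows. The 4.2 kW AC draw and 1.2 kW oven/microwave draw are assumed round numbers chosen to land near the $80 and $100 totals in the comment, and `monthly_demand_charge` is a hypothetical helper, not any real tariff calculator.

```python
# Demand charge: the single worst billed hour of the month sets the fee.
DEMAND_RATE = 19.0  # $/kW, the rate described above (an assumption)

def monthly_demand_charge(hourly_kw_peaks):
    """hourly_kw_peaks: average kW drawn in each billed hour of the month."""
    return max(hourly_kw_peaks) * DEMAND_RATE

# A 4.2 kW AC compressor running during one billed hour:
print(monthly_demand_charge([1.0, 4.2, 1.5]))  # about 79.8, roughly the $80 above
# Add a 1.2 kW oven/microwave load in that same hour:
print(monthly_demand_charge([1.0, 4.2 + 1.2, 1.5]))  # about 102.6, over $100
```

Note that only the maximum matters: every other hour in the list could be zero and the charge would be identical, which is the "it only takes once" problem.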

What you have described is an obviously terrible system that doesn’t incentivize lower power consumption.

That’s not how it’s going to work in Nevada. It will be the highest 15 minute period of each day, so if you spread out your power usage you have room to game the rates and save money. And if you have a bad day it will only cost you a dollar or two and the next day is fresh.

Plus it’s not on top of the existing consumption charge. The consumption rate is getting cut so that people should be paying roughly the same amount as before.


> What you have described is an obviously terrible system that doesn’t incentivize lower power consumption.

Doesn't it? Suppose you have a battery system with access to the current price, so it charges when it's cheap and discharges when it's expensive. Then you don't pay the $19/kW demand charge; you run on batteries during that window, or sell power back. And thereby you turn a profit from installing the battery system, creating the incentive to reduce consumption when the price is high.


A system where you are incentivized to give up for the rest of the month if you went too high once doesn’t make sense. I am skeptical that the description was accurate.

It’s a peak demand charge. They look at the highest-usage hour from 4-7pm every weekday. The highest number adds a charge billed at $19/kW of demand.

https://www.aps.com/en/Residential/Service-Plans/Compare-Ser...


It doesn't really incentivize that, but it doesn't punish it either.

You would have to be really sure there wasn't going to be a higher peak later in the month.


If the demand charge is always from 4-7PM and you have at least three hours worth of batteries at your own peak usage then your usage during that period can always be zero, because you have the other 21 hours in the day to charge them back up for tomorrow.
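As a rough sketch of that sizing argument (the function name, charge rate, and all numbers are illustrative assumptions, not anyone's actual setup):

```python
# To zero out a fixed 4-7 PM demand window, battery capacity must cover
# window_h hours at your own peak draw, and the remaining off-peak hours
# must be able to put that energy back before tomorrow's window.
def battery_covers_window(capacity_kwh, peak_kw, window_h=3.0,
                          charge_kw=2.0, offpeak_h=21.0):
    needed = peak_kw * window_h                  # worst-case energy in the window
    can_recharge = charge_kw * offpeak_h >= needed
    return capacity_kwh >= needed and can_recharge

print(battery_covers_window(15.0, 4.0))  # 12 kWh needed, 42 kWh rechargeable: True
print(battery_covers_window(10.0, 4.0))  # capacity short of the 12 kWh needed: False
```

With 21 off-peak hours to recharge, even a modest charge rate easily replaces a three-hour window's worth of energy; capacity, not recharge time, is the binding constraint here.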

It’s not just popular projects. On a small utility I maintain, I received a PR with more lines than the project itself. I’m happy to be a good maintainer, but something that’s effectively an AI rewrite isn’t something I care to review, and since I can’t vet it, I can’t blindly accept it.

I’m sure it’s all over, I was assuming the smaller projects could deal with the handful of contributions.

Something like a big emulator is very complex and has a LOT of motivated users who aren’t going to be able to make quality submissions.

So they get it in volume where it may be nearly impossible to deal with.


Prior to AWS, hosts at Amazon were provisioned as “host classes” and typically operated on that way. We were encouraged to make them “touchless”, meaning the infrastructure team could replace a host without contacting the owning team first. The deployment tool deployed to host classes (though you could put an individual server there if you wanted). EC2 wasn’t quite the same, but not very foreign either. At the team level we didn’t originally even use the AWS interface; those hosts were managed by a team working on the transition.

Unfortunately, I think Google is in the process of killing the golden goose. I visit so few unrecognized websites now and primarily rely on “AI mode” to answer my specific question rather than sift through a handful of possibly accurate pages. How long can that go on before those sites just no longer exist and the source of that knowledge, or new knowledge, evaporates? That model doesn’t seem sustainable long term.

Honestly, I think the SEO virus killed that golden goose long before the first AI chat bot. If we still had good search taking us to sane websites, ChatGPT might well have never been a thing. I was posting (including on HN) about the vulnerability of Google's search business years before AI chat. It just happens to be the thing that filled the gap when usable search disappeared.
