Hacker News | curriculum's comments

Give it some spin with your index finger. Imagine putting a coin between your index finger and thumb, heads side up, resting on your middle finger below — not too different than how most people start a coin flip. With your index finger, rotate the coin in its plane, so that it stays “heads up” but the head is rotating. Now try doing this and simultaneously flipping the coin by flicking it with your thumb at a point on the bottom, close to the edge. If you do it right, you’ll impart a spin. If you impart a modest spin, the coin will never actually flip over, but will just wobble, and will therefore land heads up. An observer will likely not know what you did, because it is hard for the eye to tell the difference between a flip and a wobble at high speeds.


I'm not sure if avoiding Home Manager is the right choice for everyone, but it worked well for me.

Home Manager isn't necessary for declarative management of the user environment -- Nix flakes can do this, too. A long time ago, I kept a single `flake.nix` in my home directory describing the packages that each of my machines needed, and ran `nix profile install .#packages.<machine>` to install them into my user profile. By doing things this way, I learned a lot about writing flakes, and this transferred to other places I used Nix.
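For illustration, a minimal `flake.nix` along those lines might look like this (the machine name `laptop`, the system, and the package list are hypothetical; the output layout follows the standard `packages.<system>.<name>` flake schema):

```nix
{
  description = "Per-machine user packages (hypothetical sketch)";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";

  outputs = { self, nixpkgs }:
    let
      pkgs = nixpkgs.legacyPackages.x86_64-linux;
    in {
      # One buildEnv per machine; add more attributes for other machines.
      packages.x86_64-linux.laptop = pkgs.buildEnv {
        name = "laptop-env";
        paths = with pkgs; [ ripgrep fzf git ];
      };
    };
}
```

With something like this checked into the home directory, the laptop's package set can be installed into the user profile with `nix profile install .#laptop` (modern `nix` resolves the current system automatically).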

What this doesn't do that Home Manager does is dotfile management, but that's actually why I avoided HM originally. First, HM's approach is a bit clunky for my taste: each change to the configuration must be followed by running `home-manager switch` for the changes to take effect. I found this slowed down the edit-and-test loop when making changes to my shell config, etc. Second, the idea of doing all configuration in the same Nix language is cool, but most of the documentation found online about configuring a tool like `git` will refer to the tool's usual method of configuration.

So instead, I made a quick Python script that manages package installation with Nix, and dotfile management with GNU Stow. The dotfiles and Nix configuration all go into the same git repository in my home directory, so they are tracked together. I've been using this approach to manage several machines for a few years now, and it's been more than sufficient for my needs.
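The script itself isn't published, but a sketch of the idea might look like this (the flake path, machine name, and stow package names are all placeholders; command construction is split out from execution so the logic is easy to follow):

```python
"""Hypothetical sketch of a wrapper that installs packages with Nix
and symlinks dotfiles with GNU Stow. All names/paths are placeholders."""
import os
import subprocess

def nix_install_cmd(flake_dir: str, machine: str) -> list[str]:
    # Install the per-machine package set from the flake.
    return ["nix", "profile", "install", f"{flake_dir}#packages.{machine}"]

def stow_cmd(dotfiles_dir: str, packages: list[str]) -> list[str]:
    # Symlink each dotfile "package" (a subdirectory) into $HOME.
    return ["stow", "-d", dotfiles_dir, "-t", os.path.expanduser("~"), *packages]

def sync(flake_dir: str, machine: str, stow_packages: list[str]) -> None:
    subprocess.run(nix_install_cmd(flake_dir, machine), check=True)
    subprocess.run(stow_cmd(flake_dir, stow_packages), check=True)
```

Since the dotfiles and the flake live in the same repository, one `sync` call brings a fresh machine up to date.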


In addition to that, home-manager does not have the same value that NixOS modules have. You are mostly translating configs by hand rather than getting, for free, the large ready-made pieces that you'd normally copy from the documentation.


One way to make the dotfile editing feedback loop faster is to tell home manager to create a symlink as opposed to writing a new file every time.

That way you don’t have to do home-manager switch when a dot file changes.
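For example, using Home Manager's `mkOutOfStoreSymlink` helper (the paths here are placeholders):

```nix
# In home.nix: link the nvim config directly to the working copy in
# ~/dotfiles, so edits take effect without running `home-manager switch`.
home.file.".config/nvim".source =
  config.lib.file.mkOutOfStoreSymlink "${config.home.homeDirectory}/dotfiles/nvim";
```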


These days, home-manager has a NixOS module, so you can switch your system and user profiles with nixos-rebuild instead of needing another tool.
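A sketch of that setup (the username, state version, and package are placeholders):

```nix
# In configuration.nix, assuming the home-manager flake input or channel
# is in scope as `home-manager`:
{
  imports = [ home-manager.nixosModules.home-manager ];

  home-manager.users.alice = { pkgs, ... }: {
    home.stateVersion = "24.05";
    home.packages = [ pkgs.ripgrep ];
  };
}
```

After this, a single `nixos-rebuild switch` updates both the system and the user profile.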


I think the idea is that California has plenty of solar generation during the day (or is on track to have plenty); what it needs is storage for when the sun isn’t shining.

The new NEM (the Net Billing Tariff) shifts the incentives away from solar generation (which the utilities have a lot of) and towards energy storage. I am in the market for solar right now, and I’ve been running the numbers. Whereas I would have had the greatest ROI with a large solar panel array under the last NEM, I now get the largest ROI with a small solar array + a battery.

I can’t say that my ROI will be the same under NEM 3.0 as it was in the old NEM, but solar is not suddenly a bad investment, as some might claim. A small solar + battery setup will pay for itself in 5 years in my situation. A battery alone (no solar panels) pays for itself within a decade, since you can buy energy for “cheap” during super off peak and store it for use during peak hours, pinning your electricity costs to the lowest of the day.
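To illustrate the battery-only arithmetic with made-up but plausible numbers (the rates, capacity, efficiency, and installed cost below are assumptions, not quotes from any utility or vendor):

```python
# Back-of-the-envelope battery payback; all inputs are hypothetical.
battery_kwh = 13.5          # usable capacity
efficiency = 0.90           # round-trip efficiency
off_peak = 0.30             # $/kWh, super off-peak buy price
peak = 0.60                 # $/kWh, peak price avoided
installed_cost = 12000.0    # $ after incentives

# Daily saving: one full cycle discharged at peak, charged at off-peak.
daily_saving = battery_kwh * efficiency * peak - battery_kwh * off_peak
annual_saving = daily_saving * 365
payback_years = installed_cost / annual_saving
print(round(payback_years, 1))  # → 10.1
```

With these assumed prices the battery pays for itself in roughly a decade, consistent with the estimate above.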

This is all with existing rates. The upcoming shift to an Income Graduated Fixed Fee will likely come with reduced per-kilowatt-hour rates, which will reduce the ROI for home solar and batteries.


California is moving towards a system which would introduce a monthly fixed cost based on your income, while simultaneously decreasing the cost of electricity per kWh (see: AB 205). Heat pumps become more attractive after such a change.


Taking money from person A and giving it to person B doesn't actually make anything cheaper.


My claim wasn't that an Income Graduated Fixed Charge would make your electric bill go down from what it currently is -- it could very well go up.

My claim is that, assuming an IGFC is implemented and the marginal cost of electricity goes down considerably, then a heat pump becomes cheaper to operate. You'll be paying the same fixed cost to be connected to the grid, whether you have a gas furnace or an electric heat pump. It might just be that the newly-lowered electric rates finally make a heat pump more cost-effective than a gas furnace.
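A toy comparison of operating cost per unit of delivered heat makes the crossover concrete (the efficiency, COP, and prices are assumptions for illustration only):

```python
# Cost per therm of delivered heat, hypothetical numbers.
KWH_PER_THERM = 29.3        # energy content of one therm

gas_price = 2.20            # $/therm
furnace_eff = 0.95          # condensing gas furnace
heat_pump_cop = 3.0         # units of heat per unit of electricity

def gas_cost_per_therm():
    return gas_price / furnace_eff

def hp_cost_per_therm(electric_rate):
    return electric_rate * KWH_PER_THERM / heat_pump_cop

print(round(gas_cost_per_therm(), 2))     # → 2.32
print(round(hp_cost_per_therm(0.40), 2))  # → 3.91  (today's assumed rate: furnace wins)
print(round(hp_cost_per_therm(0.20), 2))  # → 1.95  (lowered rate: heat pump wins)
```

Under these assumptions, halving the per-kWh rate flips the heat pump from the more expensive option to the cheaper one.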


Wait, so once they set your rate you can use as much electricity as you want and still pay the same rate?? That's insane. Crypto miners and EV drivers rejoice.


Not quite. Right now, (most) California residential electric bills are entirely volumetric, meaning that you pay for what you use — if you use zero, you pay (close to) zero. Under the new system, everyone’s bill will instead have a sizable fixed component (determined by household income) for being connected to the grid, and a volumetric component (determined by the amount that you use). The new price per kilowatt hour will be smaller than the existing rates.
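In rough terms, with entirely made-up numbers chosen so the two structures coincide for a typical household:

```python
# Hypothetical comparison of the two bill structures.
def old_bill(kwh, rate=0.50):
    return kwh * rate                 # purely volumetric

def new_bill(kwh, fixed=125.0, rate=0.25):
    return fixed + kwh * rate         # income-based fixed charge + cheaper per-kWh

print(old_bill(500))              # 250.0
print(new_bill(500))              # 250.0
print(old_bill(0), new_bill(0))   # 0.0 125.0 -- zero usage no longer means zero bill
```

So heavy users still pay more than light users; what changes is that nobody's bill goes to zero, and each marginal kilowatt-hour costs less.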


Energy cost has long been viewed as a means to constrain consumption. This new approach seems to undermine that goal, given the reduced cost per kilowatt-hour.

If I'm paying entirely based on volume, then making my home twice as efficient makes my bill half as much. But under this new system, I wouldn't realize the same savings.

Seems like a policy set with priorities other than environmental protections.


> Seems like a policy set with priorities other than environmental protections.

There are definitely other non-environmental considerations at play, the largest being that the increasing number of home solar installations has reduced revenue streams for utility companies, while at the same time their costs have increased due to grid maintenance and wildfire prevention projects (and lawsuit payouts). Most houses with solar are still heavy users of the grid, yet they pay very little towards its upkeep. This pushes the costs onto people without solar, who tend to be renters or low income households.

> Energy cost has long been viewed as a means to constrain consumption. This new approach seems to undermine that goal, given the reduced cost per kilowatt-hour.

The idea that electricity consumption must be constrained makes sense when the electricity is generated by fossil fuels -- replacing a gas furnace with an electric heat pump when the electricity is made by burning coal is not a big improvement. But we're entering a world where most of the electricity is generated by clean solar, and constraining usage doesn't reduce emissions quite as much. In this world, a heat pump powered by solar is a real improvement over a gas furnace, environmentally-speaking. But a heat pump only beats a gas furnace in terms of cost to operate if the price of electricity comes down relative to the price of gas.

From that perspective, removing constraints on electricity usage is not a bug, but a feature.


Along the same lines, asking

> How many words are in the sentence "This is a test of artificial intelligence"?

yields an answer of:

> There are 8 words in the sentence "This is a test of artificial intelligence."

(There are 7).
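For reference, a simple whitespace split confirms the count:

```python
sentence = "This is a test of artificial intelligence"
print(len(sentence.split()))  # 7
```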


My guess is that the AI miscounted because of how natural language processing works. Perhaps it cannot see 'a' because the input has been stripped of stop words like 'a' and 'the'.


I think what you're referring to in your last paragraph is a CEREC crown, but it's not always cheaper. In my case, my insurance covers a lab-made crown with a $100 copay. It takes a few days to be made, so it's slightly inconvenient. Same-day CEREC crowns aren't covered by my insurance, and cost around $500-$800 at the dentists I went to.

I discovered this last year when I went to a new dentist and they recommended two CEREC crowns and a laser gum cleaning at a total cost of around $2000. This was surprising, because I thought my dental insurance was pretty good. Turns out, it is: the dentist was just electing to use more expensive methods that weren't covered by insurance and failed to tell me about the options. I did some research later and found that the laser cleaning was not demonstrably better than the traditional approach in terms of outcomes (though I think the gums are supposed to heal faster) and was not endorsed by the periodontist professional society. I saw some sources which said that the same-day CEREC crown is actually worse than the lab-made crown in terms of fit and durability.


Indeed, and this leads to another important interpretation: matrix multiplication is function evaluation.

Arbitrary functions which take in vectors and output vectors can be very complex and thus difficult to reason about. A useful simplifying assumption is that of linearity: that f applied to a linear combination is just a linear combination of f applied to each piece of the combination separately. Linear algebra, broadly speaking, is the study of functions of this kind and the properties that emerge from making the linearity assumption.

It turns out that, if we assume a function f is linear, all of the information about that function is contained in what it does to a set of basis vectors. We can in essence "encode" the function by a table of numbers (a matrix), where the kth column contains the result of f applied to the kth basis vector. In this way, given a basis, any linear transformation f has a matrix A which compactly represents it.

Since f is linear, to compute f(v) I could write v in my chosen basis then apply f to each basis vector and recombine. Alternatively, I could write the matrix A representing f in that basis, and then multiply Av. The two are equivalent: that is, Av = f(v). And so matrix-vector multiplication is "just" evaluating the function f.
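A small sketch of this in pure Python, using a 90-degree rotation in the plane as the example linear map:

```python
# Encode a linear map f as a matrix A whose k-th column is f(e_k),
# then check that Av equals f(v). 2D example, no libraries needed.

def f(v):
    # An example linear map: rotate 90 degrees counterclockwise.
    x, y = v
    return [-y, x]

# Build A column by column from the standard basis.
basis = [[1, 0], [0, 1]]
columns = [f(e) for e in basis]   # columns[k] = f(e_k)
A = [[columns[k][i] for k in range(2)] for i in range(2)]  # rows of A

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]

v = [3, 4]
assert matvec(A, v) == f(v)       # Av = f(v): multiplication IS evaluation
print(matvec(A, v))               # [-4, 3]
```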


As a counterpoint to your example of Seattle, Republican-led Fort Worth saw its homicide numbers increase by 59% in 2020 (even more than Seattle) despite increasing their police budget over the same period:

https://www.star-telegram.com/news/local/crime/article248169...


It should be pointed out that you can install the nix package manager in Arch (or whatever other Linux, or macOS, etc.), so you can try out nix's declarative package management without actually switching to NixOS.


Doh! You're right, I completely forgot that. Thanks for pointing it out.


I tried this on my previous laptop and ran into a number of issues once I tried to install anything with a GUI. It's fine for shells and CLI tools, though. I migrated some of my configs over to Nix under Arch, and while it was a pain to initially set up NixOS (unfamiliarity), it's a lot easier doing everything else now.


Yeah, I’m not sure how up-to-date my knowledge is, but OpenGL and the like are exceptions to the usual deterministic handling of dependencies on non-NixOS distros (not because Nix is unable to handle them; I think it is mainly to avoid storing everything n times for nvidia/amd), and one has to specify them manually. It's been a while since I ran Nix on a non-NixOS distro, but there is this tool, https://github.com/guibou/nixGL , that is meant to solve the issue for graphical programs.


nixGL works without issue on my laptop's Ubuntu install, and the project also includes an equivalent wrapper for Vulkan rather than OpenGL.


It's a good question. I'm very critical of my tools and I'm a 10+ year vim user, so I've thought a lot about this.

vim's "killer feature" is its text editing language and modality. But plenty of other editors and IDEs offer vim emulation to varying degrees of success. So this can't be the reason to use (neo)vim-the-binary as my editor over, e.g., VSCode.

So for me the real advantage of vim is in its flexibility. I can make vim into whatever I want depending on the context. Because of this, I'm able to use the same vim in many different contexts, instead of having different tools for different tasks. Ironically, this is kind of like the emacs culture of doing everything inside of emacs.

For example, vim is my code editor, of course. But I also have a keybinding to pop up my wiki and immediately start editing a note in vim. I have another keybinding I use when I'm writing a long piece of text in a textbox (like now) -- the binding drops me into vim, I write my text, and when I exit the contents of the buffer are immediately copied to the clipboard for pasting into the textbox. In each of these contexts, I have access to the same familiar environment with all of my configuration, keybindings, etc.

I could probably wrangle VSCode into doing each of these things, but it would be clunky. Vim owes much of its flexibility to its lightweight terminal interface. I wouldn't want to open VSCode every time I want to write a quick note, for instance.
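The "edit in vim, then paste" binding mentioned above can be sketched as a small shell function (hypothetical: it assumes X11 with `xclip` installed; substitute `pbcopy` on macOS or `wl-copy` on Wayland):

```shell
# Hypothetical helper, bound to a hotkey: open $EDITOR on a temp file,
# then send the result to the clipboard when the editor exits.
edit_and_copy() {
    clip_cmd=${1:-"xclip -selection clipboard"}  # overridable for testing
    tmp=$(mktemp)
    ${EDITOR:-vim} "$tmp"
    $clip_cmd < "$tmp"
    rm -f "$tmp"
}
```

A window-manager keybinding would then launch a terminal running `edit_and_copy`.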


You can do that literally everywhere and with any editor, and even without an editor. I created this when I was using vim: https://github.com/majkinetor/vim-omnipresence and this to be used with any OS control: https://github.com/majkinetor/isense-x (here used for AHK intellisense, but that is just one use case)

The real benefit of vim vs. any editor is modality, and AFAIK there is nothing similar in any other editor, even with emulators. Editing/moving over text is just way faster with vim than any other editor (not just slightly faster, light years faster). I use vscode now because vim requires A LOT of setup, and even when automated it's a pain in the ass. VSCode and its plugin sync, on the other hand, make it very fast to install anywhere with your config and keys, so it compensates for the slower editing for me, since I need to have my editor EVERYWHERE (literally hundreds of computers).


Sure, you can make a hotkey to launch any editor, but not every editor is as "pathologically configurable" as vim. The combination of vim being terminal-based, highly-configurable, and lightweight makes it nice to use in a variety of contexts.


It's not lightweight.


It is, compared to pretty much any currently in-use editor.


> Editing/moving over text is just way faster with vim than any other editor (not just slightly faster, light years faster).

Text? Yes. Code? No. Editing/moving code is a subset of refactoring, and vim has no knowledge about code.


Well, this update is all about that, so I guess... solved?


It's on its way there, true :)


> vim's "killer feature" is its text editing language and modality. But plenty of other editors and IDEs offer vim emulation to varying degrees of success. So this can't be the reason to use (neo)vim-the-binary as my editor over, e.g., VSCode.

"varying degrees of success" is rather short of "sufficiently to not cause problems"; in my experience they're only good enough to get into the uncanny valley of "will this work" hesitation before every keystroke outside the most common.


> I can make vim into whatever I want depending on the context. Because of this, I'm able to use the same vim in many different contexts, instead of having different tools for different tasks.

Better to have specialised tools for whatever they are good at than a tool that isn't good at any of them, where you have to spend time molding it into a pale imitation of those specialised tools.

> I wouldn't want to open VSCode every time I want to write a quick note, for instance.

Why would you open VSCode? You open a note-taking app.

