You are not wrong! An entire subclass of anomaly detection can basically be reduced to: forecast the next data point and then measure the forecast error when the data point arrives.
Well, it doesn’t really require a forecast: variance-based anomaly detection doesn’t assert the value of the next point, only that its change stays within some band. Such models usually can’t be used to make a forecast beyond the band bounds themselves.
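A minimal sketch of the variance-band idea described above, using only NumPy: each point is checked against mean ± k·std of a trailing window, with no point forecast involved. The function name and parameters are illustrative, not from any particular library.

```python
import numpy as np

def band_anomalies(x, window=30, k=3.0):
    """Flag points that fall outside mean +/- k*std of the trailing window.

    A variance-band detector: it never predicts the next value, it only
    asserts the change stays within the band. (Illustrative sketch.)
    """
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        w = x[i - window:i]
        mu, sigma = w.mean(), w.std()
        if sigma == 0:
            sigma = 1e-12  # flat window: avoid division by zero
        flags[i] = abs(x[i] - mu) > k * sigma
    return flags

# Inject an obvious spike into Gaussian noise and detect it.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)
x[150] += 10.0
flags = band_anomalies(x)
```

The forecast-error approach from the parent comment would instead predict a value for `x[i]` and threshold the residual; the band approach skips the prediction step entirely.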
Once you hit a certain scale, anything in the AWS console becomes impossible to use. It’s one of the reasons most high to very high scale companies use something else to manage cloud resources. See: Spinnaker and Asgard for Netflix-specific examples, but also Terraform and its many alternatives.
I work for an organisation that is at this scale on AWS. Of course, we use Terraform, but the cost explorer and tags and so on are still usable (and useful).
For <business reasons> we have in the order of 10^6 SNS topics in one account+region pair. In that account, while in that region, the SNS console is entirely frozen. I suspect they're doing some looped pagination polling in the background and fall over on our degenerate use case.
Edit: I can also back up the other parent comment - tagging still works fine (and I know for a fact the stability and scalability of the tag system is taken more seriously than most things within AWS)
For those with more experience how does (Num)Pyro compare with PyMC? I haven’t had the good fortune of working with any of these libraries since before Pyro (and presumably numpyro), and with PyMC3 back when it used Theano under the hood.
Are the two libraries in competition? Or complementary? I’ve been playing with PyMC for a personal project and am curious about what I might gain from investigating (Num)Pyro.
I would say that, at least for me, PyMC’s main advantage was in DX. I just found model construction much more straightforward and better aligned with how I wanted to assemble the model.
I tried both a while back, but nothing too big or serious. One thing that numpyro benefits from is JAX's speed, so it might be faster for larger models. Though PyTensor, which is the backend for PyMC, can apparently also generate JAX code, so the difference might not be drastic. The PyMC API also seemed to me easier to get started with for those learning Bayesian stats.
One thing I remember disliking about PyMC was the PyTensor API; it feels too much like Theano/TensorFlow. I much prefer using JAX for writing custom models.
You'll lose a lot of the PyMC convenience functions with Numpyro but gain a lot of control and flexibility over your model specification. If you're doing variational inference, Numpyro is the way to go.
You can use the Numpyro NUTS sampler in PyMC with pm.sample(nuts_sampler="numpyro") and it will significantly speed up sampling. It is less stable in my experience.
This is maybe not the place, but we did some apples to apples comparisons between PyMC, Dynesty, and the Julia Turing.jl package.
A little to my surprise, despite being a Julia fan, Turing really outperformed both the Python solutions.
I think JAX should be competitive in raw speed, so it might come down to the maturity of the samplers we used.
I agree with you on the PyMC situation. There have been so many changes to the backend engine (Theano, then TensorFlow, back to Theano, then JAX, and so on) that it becomes a little confusing.
PyMC can use NumPyro as a backend. PyMC's syntax and primitives for declaring models are much nicer than (Num)Pyro's, as is the developer experience overall. But those come at the cost of having to deal with PyTensor (a fork of a fork of Theano), which is quite bad IMO, instead of just working with Numpy or PyTorch.
Similar experience. I’m in my 30s and many peers in the same age group (add a standard deviation in each direction, but skew to the left) “don’t know how to cook” and DoorDash daily, sometimes with dessert as a separate order made later. When I mention that it sounds wildly expensive, it doesn’t even register. There’s a pretty large gap in the assumed baseline between people who don’t use these services and people who rely on them.
Yeah, it might not even be entirely an age thing; it's more that many folks who use these services REALLY use them. There might be other factors at play in who ends up in which group.
> It's almost like the banking system was designed by rich people to suit the needs of rich people or something.
Food for thought: if they didn’t have an investment manager before this, and their primary asset was a big house they couldn’t afford… these people weren’t rich. They were working class, overextended themselves into a house, and got lucky.
Your premise may have some validity but the story in this thread may be an example of a bank making a working class family rich.
> I happened to look at a slide deck from Sandia National Laboratories from 2007 that someone had posted on Reddit late last night (you know, as one does, instead of sleeping), and one particular slide jumped out at me:
The author is making fun of themselves for being up late reading this deck instead of sleeping. They’re not making fun of the person who posted the slide deck.
Ah. How could I be so foolish as to think there might be some nuance to the situation! For what it’s worth a lot of friends in finance are/were bummed about how this law might affect gardening leave. [0]
That does sound anti-competitive. Imagine if Google poaches hot people in industry and then just sits on them during a critical market time, doing nothing, because they don't have any particular work for them, they just don't want them working for the competition.
Those garden contracts sound like a variation of the same idea.
Anyway, if I understand the FTC's position right, their rule also carved out nuance for specific non-competes.
At least with garden leave the person is getting compensated. It's substantially better than a minimum wage worker at ~~Subway~~ Jimmy John's being told they're not getting paid and they also can't make sandwiches for six months.
That's one scenario I'm afraid of... that businesses have a rational incentive to push boundaries for workers who aren't ultra high value, and that terms like these basically force you to stay at Subway.
I'm not in the "100% of non-competes are harmful" camp. But definitely 98-99% of them are. I agree with California's exceptions:
>The only exceptions are non-compete or restrictive covenants that fall within one of the narrow exemptions authorized by statute, all of which relate to the sale of the goodwill of a business, or of a substantial ownership stake in the business. Courts interpret these statutory exceptions very narrowly.
If you have enough skin in the game that you can make life or death decisions for a company (or take out money in stock that can sink the company), you probably have enough money that a non-compete won't put you on the streets.
VTSAX isn’t a free lunch, but you pay in risk rather than in time spent managing real estate. It’s passive assuming you can handle drawdowns, whereas real estate is passive assuming you think running a rental business is passive.