
Why on earth was my title replaced? Sigh.

Revolutionary memristors purposely delayed to prevent cannibalization of flash

It's 100% accurate and more relevant.



It makes no sense to me.

What's the advantage of memristor-based storage, exactly?

If it's cheaper to make, then sell storage at a price that undercuts the competition but still makes a better profit.

If it's more reliable, then sell storage at a higher price with a better warranty.

There's no sensible scenario where memristors are ready to go but they don't want to sell any memristor products yet because it will hurt existing sales. Either they can make a better product, or they can't, and if they can't make a better product then the stuff simply isn't ready.


It's the same price for double the storage, less power and likely more reliability.

http://www.zdnet.com/blog/foremski/tha-amazing-memristor-bey...

They are probably delaying to get their profit out of existing investments in tool and die for their factories, as well as high inventories.

While all the manufacturers are likely not in cahoots, it's probably like the airline industry, where they look to the left and right, nod at each other, and change fees right after one another.

Also notable how hard drive prices have not dropped even though supply is completely back to normal.


Storage can be as reliable as you want; you only need to throw parity at the problem. Flash is one of the most unreliable technologies in use today: chips are often rated at only 10,000 writes before bit errors appear, and flash is usable as storage today only thanks to advanced error-correction (ECC) algorithms. BTW, spinning hard discs have also used ECC for many years, but it is much less powerful.
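The "throw parity at the problem" idea can be sketched minimally: a single even-parity bit detects any one-bit flip in a block. (This is an illustrative toy, not what real flash controllers use; those rely on much stronger codes like BCH or LDPC, which can also correct errors, not just detect them.)

```python
def parity_bit(data: bytes) -> int:
    """Even-parity check bit: 1 if the number of 1-bits in the data is odd."""
    ones = sum(bin(b).count("1") for b in data)
    return ones % 2

block = b"hello flash"
check = parity_bit(block)          # stored alongside the data

# On read-back, a single flipped bit changes the parity, exposing the error.
corrupted = bytes([block[0] ^ 0x01]) + block[1:]
assert parity_bit(corrupted) != check  # single-bit error detected
```

Note the limitation: two flipped bits cancel out and go undetected, which is why real storage stacks layer stronger ECC on top.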


The advantages of memristor technology are many. In ordinary conventional non-volatile storage it offers densities higher than flash and speeds approaching SDRAM. I'll let you work out the implications of that on your own.


> I'll let you work out the implications of that on your own.

If those are ready, HP and partners can sell them at a premium.


That doesn't necessarily help. For any technology there will be a price vs. expected sales volume curve and thus a sweet spot that maximizes revenue. It's possible that people have looked at enough numbers and figured out that the interaction between the demand curves for memristors and for flash is such that they can more easily maximize revenue by delaying memristors.

For example, pricing memristor-based drives exceedingly high will still likely put them well within the price range of ultra high-end flash-based enterprise drives (such as card-based multi-terabyte SLC drives), and so will still result in a cannibalization of flash sales.


What if they have a bunch of flash products already made? They need to get rid of that product first, before a new release makes it worth less.


They're talking about delaying commercialization from a year out to sometime farther. There's no way they have a year's worth of flash products sitting in inventory.


It's not only products sitting in inventory. It's also:

1) Products sitting in storefronts and retail channels that will go unsold if they announce something new.

2) Investments (in the billions) they have made for flash production. You don't just kill all the money you have invested to build a flash plant because there's a new better technology. Not if you can delay the introduction of said technology until those costs are absorbed, and the investments have paid for themselves.

3) Materials they have ordered for flash production. Volume deals with manufacturers for metals and the like can run not just one year ahead but several.

Selling the old stuff for a while is not even about price fixing; if you think at this level, it's something natural that benefits all companies involved.

They all have: factory stock, retail stock, investments in flash technology and volume orders on materials to absorb before making the jump.


This all strikes me as an example of a sunk-cost fallacy. Either memristors will make more money, or they won't. If they will, it doesn't matter that you have a billion dollars invested in X that won't be useful anymore, because you've already spent that money. If memristors won't make more money than making flash products using existing infrastructure, then it's not ready for commercial use and there's no conspiracy behind the delay.


This is very clearly not the sunk cost fallacy. In this case, they are delaying making money (thanks to collusion + patents most likely) and they are still going to recoup their currently 'sunk' costs.

The sunk cost fallacy is continuing to put money into something explicitly because you're counting the previously sunk costs instead of justifying it by "the remainder to be spent < the expected return value." In this case, that's not the calculation that's being made at all.

Anyway, still enjoyed your comment :)


>This all strikes me as an example of a sunk-cost fallacy. Either memristors will make more money, or they won't. If they will, it doesn't matter that you have a billion dollars invested in X that won't be useful anymore, because you've already spent that money.

That's nothing like the "sunk cost fallacy". For one, in the sunk cost fallacy you cannot get your "sunk costs" back.

But those companies CAN (and will), sell their old flash tech stock.


>There's no sensible scenario where memristors are ready to go but they don't want to sell any memristor products yet because it will hurt existing sales.

It's not about existing sales, it's about stock. If you have built up a lot of stock of traditional devices (including materials and infrastructure for manufacturing) while waiting for memristor tech to be ready for prime time, you want to sell it first before you put the new products on the market.

But surely competitors will jump right in, right? Unfortunately, there aren't that many players. Similarly, they can't "undercut the competition" because there is no real competition, just a few players that "understand" each other.

Storage/memory manufacturers have had a few price-fixing scandals over the years.


> It's not about existing sales, it's about stock.

They keep a year of sales in inventory? That's a lot of money to have tied up.


According to Wikipedia, "(March 23, 2012) A team of researchers from HRL Laboratories and the University of Michigan announced the first functioning memristor array built on a CMOS chip for applications in neuromorphic computer architectures." This technology is two to three years away from being available in consumer packages.

You should expect to be able to purchase something (for more than the equivalent flash package would cost you) in 2015-2016. If they play out the way they should, the price/performance slope should cross over with flash around 2017-2018.

These new technologies usually take about 10 years to come online at volume. No conspiracy required.


I fucking hate that mods keep doing this.


Where do you draw the line if they didn't? If everyone were allowed to submit whatever title they wanted, it would quickly degrade into Digg-quality submissions. In rare circumstances, it is necessary to change titles. A piece of content might not have any title at all, so it needs one. It is better for the health of HN that mods act on the conservative side.


Maybe something HN does automatically to prevent editorializing. Not saying that your title in particular was doing that.


The ridiculous thing is it's not like the original headline wasn't editorializing.


It was replaced because the original title was perfectly fine, and your version was a bit editorialized. I'll grant that you provided a good summary, but "revolutionary" is a loaded adjective. Some people, including myself, would not find memristors to actually be revolutionary. They are an important technological advance, but that is all. Penicillin was revolutionary, as was the Magna Carta. The word is too often used for things that really are not revolutionary at all.


A lot of people think memristors have the potential to change computing in fundamental ways (requiring new architectures) and bring forth true AI. I'd call it revolutionary.

http://spectrum.ieee.org/robotics/artificial-intelligence/mo...

http://www.neurdon.com/category/synapse/

http://www.eetimes.com/electronics-news/4088605/Memristor-em...


The idea that memristors will bring forth true AI is just silly. It's all just Turing-complete computation, and the blocker on true AI right now isn't speed. Otherwise we could still run it, just slower. And not actually that much slower, for that matter.

And that new architectures would be required is only interesting if you're a breathless industry-press reporter. New architectures are required all the time for all sorts of things. Intel, AMD, and nVidia all pump them out on a roughly yearly cadence. Flash drives, even without memristors, are going to rearchitect "the cloud" in the next couple of years. Architectures flow like water in this industry.


Memristors have the interesting property of adapting their resistance over time as current flows through them. Thus they can mimic Hebbian learning, a fundamental property of synapses, in a way that other electronic components cannot easily match.

No one knows what the architecture of the first AI will look like. However, our current architectures have a big problem (for creating AI) in that they have a very clean separation between data and code. Code runs in the CPU and alters data in RAM and on disk. Brains don't work this way: there is no separation between data and code. For this reason I think memristors, while not a silver bullet, might represent a step in the right direction.


"However, our current architectures have a big problem (for creating AI) in that they have a very clean separation between data and code."

That's a security measure, rather than a fundamental property of transistors. We had to add that property to our architectures.

Simulating Hebbian learning isn't that difficult with transistors and conventional computation, so you don't get that big an advantage right now. If we were brushing up against the fundamental limits then it would matter, but we're a ways away from that yet.
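To make the point concrete, a basic Hebbian update is a one-liner on conventional hardware. (This is a toy sketch with made-up parameters; memristor crossbars would realize the same rule in analog hardware rather than software.)

```python
import random

def hebbian_step(w: float, pre: int, post: int, eta: float = 0.1) -> float:
    """Basic Hebbian rule: the weight grows when pre- and post-synaptic
    activity coincide (delta_w = eta * pre * post)."""
    return w + eta * pre * post

w = 0.0
for _ in range(100):
    pre = random.choice([0, 1])
    post = pre  # perfectly correlated activity for the sake of illustration
    w = hebbian_step(w, pre, post)

# "Cells that fire together wire together": correlated activity
# strengthens the connection.
assert w > 0
```

A real simulation would add weight decay or normalization (plain Hebbian growth is unbounded), but the core update clearly poses no challenge to a conventional CPU.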


A lot of people have thought a lot of things had the potential to change computing in fundamental ways.


Don't crush my dreams man.



