randomrubydev's comments | Hacker News

We will finally see a mobile device running the PS3 emulator RPCS3 soon...


> There’s a reason pretty much any real tech company doesn’t use nosql except in very clear explicit cases.

That is such a bold claim, and it is obviously not true.


Can you give a single example of a major tech company using nosql as its primary db for any application?


MongoDB has 26,500 customers worldwide (and we are only one NoSQL vendor).

These customers include:

Bosch : https://www.mongodb.com/customers/bosch

HSBC : https://diginomica.com/hsbc-moves-65-relational-databases-on...

SEGA Hardlight : https://www.mongodb.com/blog/post/sega-hardlight-migrates-to...

HMRC : https://www.mongodb.com/blog/post/mongodb-microservices-help...

DWP : https://www.mongodb.com/customers/department-for-work-and-pe...

Liberty Mutual : https://www.mongodb.com/blog/post/liberty-mutual-iac-mongodb...

MetLife : https://gigaom.com/2013/05/07/with-300m-earmarked-for-tech-i...

There is a more complete list here : https://www.mongodb.com/who-uses-mongodb

That list is limited to just the customers that are willing to be public references.

Every mature NoSQL vendor has a similar list.


Pretty much all of AWS and Amazon runs on DynamoDB.

Google has Firestore as one of their approved databases for internal use.

Those are ones I have first hand experience with.

I know from friends that Microsoft runs things on Cosmos.

Netflix is pretty invested in Cassandra.


Stripe


The M1 is essentially a large version of the A14 chip and uses the same Firestorm and Icestorm cores. With such insubstantial improvements in the A15, the M2 should theoretically also have insubstantial gains, because it will be a large version of the A15. The article also mentions that some key technical staff jumped ship, which might slow innovation.


How substantial really depends on the workload. The gain is small in single-thread performance, but it's easy to grow with more cores, and I'd love to see a lot of background processes moved to Icestorm cores to keep the Firestorm ones free.


While IPC is probably not a huge bump, the brute force of doubling the System Cache should yield decent improvements in a lot of workloads on the laptop side.


I would argue that it's a pretty good measure of programming and CS problem-solving skill, and that the alternatives are weak.


It's much better to give a candidate a simplified version of your typical daily task. Give them enough time so they can google and learn if needed. That usually means a very simple task that you could solve in an hour or two at your leisure at home.

Now, I do get that there are a lot of people who don't like spending time at home on interview tasks, but as long as it's not skewed to an extreme (say, a big task worth 8 working hours) then, in terms of time wasted, it's not such a big difference. The interviewer can then see the code quality, talk about it with the candidate, clarify missing pieces or pitfalls found, etc.

IMHO the most important thing is not whether the candidate can solve some hard or even medium problem while speaking to me, when they are stressed enough already. What is important is whether they're willing to learn, whether they know how to search for things they may not know, and whether they can produce performant enough, but excellent to read, code.


It depends what you're looking for. If you want someone who can turn the handle on your typical daily task then, sure, test them on your typical daily task. But if you want someone capable of developing solutions to brand new problems then it's not so easy and testing fundamental computer science theory is important.


It’s not. Theory can be referenced. People do not work in a vacuum.

Interviewing in eng is broken, but afaict it's a "worst solution save all the others" kind of scenario.

But let us not begin to deem these intrinsically important.

Some of the most creative and productive coworkers I've had struggled with leetcode-style interviews. They're a bad tool for anyone who isn't a new grad, and even then.


When you apply for jobs do you simply look for "engineering" positions? Why am I always applying for software engineering and not electrical engineering? It's all engineering, and theory can be referenced, right? In fact, why doesn't everyone just buy a book and become a top engineer?

The point is not (or shouldn't be) to recite a textbook. The point is you can navigate your way around the textbooks. I've got both The Art of Computer Programming and The Art of Electronics on my shelf. I could find the sections to help sorting a list in seconds. As for the latter, I have no idea why the majority of that book even exists. I can't call myself an electrical engineer, even though all the theory I need is within arm's reach.

I assume you're arguing against the "recite the textbook" approach. I would agree that this is not the way to do things. But equally, "throw the textbooks out" is not the right way either. We need to evaluate a high-level grasp of the literature/theory but don't punish for forgetting minutiae. I might ask a candidate to talk about choice of sorting algorithms. There is, of course, no perfect answer, but what I'll be expecting is general evaluation of algorithms: time/memory tradeoffs, probing for more domain knowledge (e.g. does the data often come in sorted or random), platform constraints etc. I won't even expect a name drop of an actual sorting algorithm as that's not really the point. What they're telling me is they know why Knuth has a whole chapter on sorting. That's the important thing.
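To make the sorting discussion above concrete, here is a minimal sketch (Python; the function name and data are made up for illustration) of one such tradeoff: when you know the keys are small non-negative integers, a counting sort spends O(k) extra memory to get O(n + k) time, versus the general-purpose O(n log n) comparison sort. That's exactly the kind of "what do we know about the data" reasoning the interview is probing for.

```python
def counting_sort(xs, max_key):
    """Sort small non-negative integer keys in O(n + max_key) time.

    Trades extra memory (one bucket per possible key) for avoiding
    comparison-based O(n log n) cost -- only sensible when the key
    range is known and small relative to n.
    """
    counts = [0] * (max_key + 1)
    for x in xs:
        counts[x] += 1
    out = []
    for key, n in enumerate(counts):
        out.extend([key] * n)
    return out

data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
assert counting_sort(data, 9) == sorted(data)
```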


This is a false dichotomy. Specific theory whose details are hard (and useless) to memorise can be easily referenced if you are knowledgeable enough in the field. If you know about a red-black tree and the gist of its properties, you can easily google usage cases you've forgotten, examples of it, and the algorithms related to it (rebalancing, how it relates to search, etc.). If you had never studied, used or seen one, there is no way to look up those properties at all.

I'd much rather hire and work with someone who has the skill to easily assess a situation and use referencing to rebuild knowledge than someone who memorised how to implement tree balancing, so why do we test for the latter rather than the former?
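As a sketch of what "the gist of its properties" amounts to, here is a toy red-black invariant checker (Python; the Node class and function names are hypothetical, not from any library). The point is that the invariants themselves (black root, no red node with a red child, equal black count on every root-to-leaf path) are short enough to rebuild from a reference once you know they exist:

```python
class Node:
    def __init__(self, color, left=None, right=None):
        self.color = color          # "red" or "black"
        self.left, self.right = left, right

def black_height(node):
    """Return the black-height of a subtree satisfying the red-black
    invariants, or None if any invariant is violated."""
    if node is None:
        return 1  # nil leaves count as black
    if node.color == "red":
        for child in (node.left, node.right):
            if child is not None and child.color == "red":
                return None  # red node with a red child
    lh, rh = black_height(node.left), black_height(node.right)
    if lh is None or rh is None or lh != rh:
        return None  # violation below, or unequal black counts
    return lh + (1 if node.color == "black" else 0)

def is_red_black(root):
    return root is not None and root.color == "black" \
        and black_height(root) is not None
```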


Some companies want to test whether a person spent time preparing for the interview. So asking all those quiz questions does make sense even if they are not relevant. At least it shows that the person knows the rules of the game and is willing to invest substantial effort to follow them, even if the rules are arbitrary and irrelevant to day-to-day activities.


Okay. So arbitrary preparation - when nearly any other professional interview requires little preparation beyond updating your resume - has merit because... rules of the game?

Stop supporting baseless metrics for assessment just because some old person used them before you showed up. We can and should do better.


This is how it is with IT companies paying well above average. Given that they are able to pay such salaries this interview strategy is compatible with big profits.

It could be that by changing the interview strategy to look more like other professions', profit could be increased even further, but nobody is risking it.


I can use your argument to push the other way: doesn't this strategy also reveal a huge bias that it selects for, one that then presents itself inside tech companies? By that I mean the bias of "learning to play the game": selecting for people who will conform to arbitrary rules for their promotions, caring about playing the game instead of analysing the impact of their work.

And given the recent issues with data privacy and data abuse by the tech giants, would we be in this place if the interview processes had selected for more holistic engineers: technically able, but refusing to play the game just for the sake of playing it, opinionated, and unwilling to conform just for the sake of money?

I know that I might be creating a false dichotomy, but I would like to think about what kind of pressure this selection process creates and what biases arise from it. How can we make it better?

Because your argument is the most conservative and pro-establishment one: it works so don't touch it and just emulate.


I was not arguing for these types of interviews. My point was that one can rationally explain apparently useless quiz questions. And yes, this is a strong selection bias to pick people that agree to play by arbitrary rules without questioning them.


I’ve worked at several shops paying well above average with interview processes that hinged on more representative work.

Not everyone is playing the absurdly doofy “game,” just most.

It's a local maximum that laziness has us trapped in. Nothing to do with merit.


Being able to refresh one’s memory / reference previously learned approaches is not akin to learning them from scratch. Your opener is preposterous.


> It’s not.

Yes it is.

> Theory can be referenced

How do you know that the person is even able to comprehend theory?

> Interviewing in eng is broken, but afaict its a “worst solution save all others” kind of scenario.

That's your opinion.

> Some of the most creative and productive coworkers I’ve had struggled with leetcode style interviews.

Good for you. But "slumpt_'s most creative and productive coworkers" is not a good metric for hiring.

> They’re a bad tool for anyone who isnt a new grad, and even then.

Again, that's your opinion. I'm not a pro in those interviews, but studying DS and algos opened up and pushed my mind to its limits like nothing else. Your whole thinking process changes when you start working on this, you start thinking about constraints, performance implications, pro and cons of different approaches. It is called Computer SCIENCE for a reason.


The person who's going to come up with new ideas isn't spending their time memorizing old ones. They learn to index where to retrieve knowledge when necessary, which lets them cover a wider breadth of knowledge. And this allows them to pick the best tool for the job at hand, as opposed to the tool they happen to be an expert in. Sometimes you need a handyman instead of a master plumber, because they are better able to see the big picture beyond all the shit.


My best coworkers is about as good of a metric as things that “stretched your mind to its limits.”

The point is that neither has been demonstrated to bear any relationship to jack shit.

Interviewing is and has been broken, even with the changes we’ve made over the years.

If you’re holding onto leetcode challenges that make you think hard as representative of engineering prowess we’re never going to have a reasonable conversation.


There are plenty of people who have a fantastic knowledge of CS theory and are pretty useless at solving real world problems.


Again, it depends what you're looking for. If the real world problem is "we need a fast optimising compiler that runs on our embedded platform" then hiring someone who is great solving problems but knows nothing of compiler theory is going to be very inefficient.


> solving real world problems

Define this first.


changing the color of a button in an Electron app, or moving JSONs back and forth (from backend to frontend)


Most of these types of algorithms already have tons of research available online, as people try to figure out what the lower bound of optimization is. It's far more telling to just talk about previous projects the person has worked on to gauge their level of competence. Asking them to explain why they made the choices they did vs. trying to see how much they can memorize tests two different skill sets. The person who makes better choices is the one you want to hire.


How does someone talking tell you whether they can actually do basic programming? I think you would be surprised at the number of people who apply for software engineering jobs but barely know how to program.


To be fair, I know a fair number of people who are good at competitive programming but are absolutely awful at writing maintainable code.


Yeah, those competitions are not really representative of actual skill.

I was into competitive math as a teenager and was somewhat successful, but I actually kind of suck at math.

Similarly, I'm a professional developer but I'm really bad at competitive programming: what usually happens is that I know how to solve the problems but the time limit is too low (for me, at least).

I'd say success in competitions is a good indicator of dedication and perseverance, but not sufficient to spot someone who's good at the job.


Yeah, competitive programming forces you to use short variable names and write makeshift code which is fast enough to pass all the test cases ...
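A small illustrative contrast (Python; both functions are made up for the example): the two below compute the same maximum-subarray sum and pass the same tests, but only one survives a code review.

```python
# Contest style: terse, write-once, brute force.
def f(a):
    return max(sum(a[i:j]) for i in range(len(a))
               for j in range(i + 1, len(a) + 1))

# Maintainable style: same result (Kadane's algorithm), named for humans.
def max_subarray_sum(values):
    """Largest sum over any non-empty contiguous run of `values`."""
    best = current = values[0]
    for v in values[1:]:
        current = max(v, current + v)   # extend the run or restart it here
        best = max(best, current)
    return best
```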


Because you can determine whether they organize and test things in a repeatable and maintainable way, or whether they have trouble organizing structures and make questionable performance decisions. Are they clear on has-a vs. is-a? Do they know what a mutex or static scope is? These are the things that will cause huge debugging nightmares. Syntax issues are nowhere close to as problematic, so why use whiteboards instead of an actual computer? In my experience of interviewing, questions about security and threading (performance / micro-opts) are good for separating the wheat from the chaff.
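As a minimal sketch of the mutex point (Python; the names are made up), here is the classic lost-update race on a shared counter, closed with a lock. CPython's GIL can mask this particular race for simple increments, but the pattern applies to any shared read-modify-write, and it's exactly the kind of thing that causes the debugging nightmares mentioned above.

```python
import threading

counter = 0
lock = threading.Lock()

def bump(times):
    global counter
    for _ in range(times):
        with lock:        # without this, `counter += 1` is a race:
            counter += 1  # read, add, write can interleave across threads

threads = [threading.Thread(target=bump, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter == 40_000  # guaranteed only because of the lock
```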


By the same logic we'd have doctors do surgery on the spot, construction workers build a toy house, teachers give a class for free, cooks spend one day serving meals for free, ...


Are these mutually exclusive? I've never been through an interview that didn't ask questions about previous work regardless of how the technical test was structured.


You'd think so, but in my experience the folks who are very interested in algorithmic design and so forth produce highly abstract and hard-to-understand solutions to simple problems, which, in the end, make up the majority of regular dev work.

Sure, if you're applying for a job that really demands algorithmic design skills, they should be a great asset, but in general the most valuable skill any programmer has is producing simple and robust code that works and that others can continue building on. I don't deny that knowing algorithmic design well helps a lot, but it does seem to feed programmers' egos to produce overly complicated solutions.


This is a nice point -

I'd answer this type of algorithm question in truth by identifying the relevant library wherever possible, not by coding it myself, and I'd strongly expect anyone I was working with to do the same.
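For instance (Python; the wrapper function is a hypothetical example), reaching for the standard library's bisect instead of hand-rolling a binary search:

```python
import bisect

def contains(sorted_values, target):
    """Membership test on a sorted list via the standard library.

    bisect_left returns the insertion point; checking the element there
    tells us whether target is present, in O(log n), with no off-by-one
    loop of our own to re-debug.
    """
    i = bisect.bisect_left(sorted_values, target)
    return i < len(sorted_values) and sorted_values[i] == target

assert contains([1, 3, 5, 7], 5)
assert not contains([1, 3, 5, 7], 4)
```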


"I always code my own AES libraries because I'm an expert" - said no expert ever.


It's a much better measure of how many leetcode DP problems you solved while grinding interview prep. I'd argue it's a pretty poor measure of software engineering ability.


Can't read it because I've apparently reached the number of articles I can read this month. F*ck Medium

