Open source ChatGPT bot platform in Rails 7 by Obie Fernandez and MagmaLabs (obie.medium.com)
7 points by obiefernandez on April 27, 2023 | hide | past | favorite | 5 comments


There are a few really cool things to showcase here:

1) the bots have memory, which means they remember what you discussed before. Pair programming? You don’t have to keep repeating the tech stack and project description

2) scrolling context window. Because bots have memory it’s not as big a deal to just “scroll” the context window down the chat as it gets longer than 4K tokens. Never run out of context
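A minimal sketch of what a scrolling context window could look like: keep only the most recent messages that fit under a token budget, dropping the oldest first. The class name, the chars-divided-by-4 token estimate, and the message shape are all assumptions for illustration, not the project's actual code; a real implementation would count tokens with a proper tokenizer.

```ruby
# Keep the newest messages that fit within a token budget.
# Token counting here is a crude chars/4 estimate.
class ScrollingContext
  def initialize(max_tokens: 4000)
    @max_tokens = max_tokens
  end

  # messages: array of { role:, content: } hashes, oldest first.
  # Returns the suffix of the conversation that fits the budget.
  def window(messages)
    kept = []
    budget = @max_tokens
    messages.reverse_each do |msg|
      cost = estimate_tokens(msg[:content])
      break if cost > budget
      budget -= cost
      kept.unshift(msg) # keep chronological order
    end
    kept
  end

  private

  def estimate_tokens(text)
    (text.length / 4.0).ceil
  end
end
```

With long-term memory backing it (point 1), truncating the visible window this way loses less, since earlier facts can be recalled from storage rather than from the prompt.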

3) universal internationalization. App text is run through GPT for translation and cached at runtime. Pick whatever language you want, from Spanish to Mandarin to Klingon to baby talk to Pig Latin to emojis.
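A rough sketch of the translate-and-cache idea: each UI string is translated once per target language and memoized, so repeat renders skip the API call. The class name and `translate_via_gpt` are placeholders I'm assuming, not the project's actual code; in a Rails app the hash cache would be `Rails.cache`.

```ruby
require "digest"

# Translate UI strings at runtime, caching one translation
# per (locale, string) pair so the model is only called once.
class UniversalTranslator
  def initialize(cache: {})
    @cache = cache # stand-in for Rails.cache in an actual Rails app
  end

  def t(text, locale:)
    return text if locale == "en"
    key = "i18n/#{locale}/#{Digest::SHA256.hexdigest(text)}"
    @cache[key] ||= translate_via_gpt(text, locale)
  end

  private

  def translate_via_gpt(text, locale)
    # Placeholder: a real implementation would send a chat-completion
    # request, e.g. "Translate this UI string into #{locale}: ..."
    "[#{locale}] #{text}"
  end
end
```

Because the cache key is derived from the source string, any locale label the model understands ("Klingon", "pig latin", "emoji") works without a hand-maintained locale file.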


> Pair programming? You don’t have to keep repeating the tech stack and project description

Dear god, I can't believe how many times I've had to shout this from the rooftops on HN:

DO NOT SEND YOUR CODE TO OPENAI OR ANYONE ELSE.

Anyone working on their employer's code could (and should) be fired for uploading their code to an untrusted cloud, especially one that explicitly tells you not to send it secrets.


Open source much? I guess not.

Also, I’m an employer and I don’t give a flying flip if my employees pair with ChatGPT. So maybe, just maybe, you’re, I dunno, wrong?


Open source code can still have local secrets.

You're a single employer. Anecdotes aren't data.

Any company that cares even a little about its users and customers cares more about security than about letting its local code directories hit a cloud that explicitly warns you not to send it.

So far your counterpoints are: 1) some code isn't secret, and 2) I do it. If I'm wrong, you'd have better points than that.

You're essentially saying you don't care if your family uses seatbelts, so seatbelts aren't necessary. Not everyone runs a clownshoes security operation.


What I'm saying is: how about we leave it up to the discretion of the developer to know whether the code they're working on is valuable enough to warrant security concerns. I personally feel that a lot of code written by mere mortals is not, and that's not simply anecdata; it is informed by nearly 30 years of experience working on software professionally.

This is an especially ridiculous argument to be having in a world where vast numbers of programmers use Copilot day in day out on their corporate codebases, with managerial approval I might add, because productivity.

The fetishization of security for ordinary code is ridiculous imo.



