cross-posted to:
- [email protected]
New development policy: code generated by a large language model or similar technology (e.g. ChatGPT, GitHub Copilot) is presumed to be tainted (i.e. of unclear copyright, not fitting NetBSD’s licensing goals) and cannot be committed to NetBSD.
OK, but how is anyone meant to know whether you generated your docstrings with Copilot?
How do they know that you wrote it yourself and didn’t just steal it?
This is a rule to protect themselves. If there is ever a legal case around this, they can shift the blame to the person who committed the code for breaking that rule.
That's the only reason rules like this exist: not to stop people from doing a thing, but to be able to enforce consequences or deflect responsibility when they do.
I mean, generally rules exist at least to strongly discourage people from doing a thing, or to lead to consequences that WOULD prevent them from doing it.
A purely conceptual rule by itself won't magically stop anyone, but that's kind of a weird way to think about rules.
They’ll use AI to detect it… obviously. ☺️
Are they long, super verbose and often incorrect?
Magic, I guess?
Because they’ll be shit?
Docstrings based on the method signature and the literal contents of a method or class are completely pointless, and that's all Copilot can do. It can't intuit what docstrings are actually there for: the intent and caveats the code doesn't state.
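To illustrate (a hypothetical Python sketch; the `transfer` function and its contract are made up for the example): the first docstring only restates the signature, which is the kind of thing generated from the literal code, while the second records what the code can't say about itself.

```python
def transfer(source: str, dest: str, amount: int) -> bool:
    """Transfer amount from source to dest and return a bool."""
    # Restates the signature; a reader learns nothing new.
    ...


def transfer(source: str, dest: str, amount: int) -> bool:
    """Move funds between two accounts as a single atomic operation.

    amount is in cents, never fractional currency. Returns False
    instead of raising when source has insufficient funds, so
    callers must check the result. Safe to retry on failure.
    """
    ...
```

None of the units, error behavior, or retry guarantees in the second version are visible in the signature or the body; that's exactly the information a tool working only from the code can't produce.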