Shpongle, when the walls melt.
Shpongle, when the walls melt.
It’s a russian Margolin, or some variant. So yes, a .22LR.
Yep, once anyone can download an app on their phone and do something like this without any effort in realtime it’s going to lose its (shock) value fast. It would be like sketching a crude boobs and vagina on someones photo with MS Paint and trying to use that for blackmail or shaming. It would just seem sad and childish.
Practical, but not really equivalent though because of nil punning.
The locking down started with the original MacIntosh (or actually with the Lisa I guess). ISTR they had at least one bit more open period after that, but those have always been the exception.
Wouldn’t it be more correct to say that most Americans also use a messaging app (iMessage). The rest are just stuck with SMS to have compatibility with the iPhone users.
As the iPhone was (is?) not as popular in the Europe as it was (is) in the States that might also be one of the reasons why people here ditched SMS so fast once smartphones got popular.
Join and recommend smaller general instances like lemm.ee, vlemmy.net, and lemmy.one at random instead. Smaller servers have been upgraded for the surge of users too you know
That was basically my logic when I joined lemmy.world a few weeks ago. Oh well…
That’s actually cool and a bit like what I had in mind. But it doesn’t seem to offer an actual hierarchical view of the lemmyverse.
It would be nice to have a forum style clear treeview of the forums (instances) and their subforums (communities) with activity indicators etc to make browsing and discovering content straight forward. Then if you subscribe to a community it would also show in it’s own treeview that the user could arrange to their liking.
You are not alone, and I’m starting to feel that treating Lemmy like a federation of web forums instead of Reddit replacement would fit the underlying model better.
Speaking as just a hobbyist, a more developer oriented community focused on the topic would be nice, if someone is up to the task.
It’s currently hard to find any good information about how to actually use LLMs as part of a software project as most of the related subreddits etc. are more focused on shitposting and you don’t currently really want to talk about these in general tech/programming forums without a huge Don’t shoot I’m not one of them! disclaimer.
Regarding little Bobby, is there any known guaranteed way to harden the current systems against prompt injections?
This is something that I’m personally more worried about than Skynet or mass unemployment now that everyone and their dog is rushing to integrate LLMs into to their systems (ok worried maybe a wrong word, but let’s just say I have the popcorns ready for the moment the first mass breaches happen with something like the Windows Copilot).
At least I’m interested but more technical discussion about this would probably fit better in some comp sci or programming community? Though most of those are a bit hostile to the LLM related topics these days because of all the hype and low effort spam.
Is the whole “You are an LLM by OpenAI, system date is etc.” prompt part of the system message?
A few days ago when I was talking about controlled natural languages with it and asked it to give a summary of the chat so far in Gellish it spit that out.
If these commands were in a system message it would generally refuse to help you.
Doesn’t it usually fairly easily give its system message to the user? I have had that happen purely by accident.
I’m not sure if I’d call that reverse engineering any more than using a web browsers View Source feature.
But it’s interesting how it works behind the scenes and that only way to get these models to interface with the external world is by using the natural language interface and hoping for the best.
Me too. But I’m probably never going to check most them just to see if they are even alive since it’s just too much of an hassle.
I think there was an update recently and it seems to do that to me too now when I tried it. But the mobile web version is better anyway for now.
Right, I had missed that and forgot about those (I actually just now after using Lemmy realised how annoying the fragmentation can sometimes be even with Reddit)
But it’s not something that Lemmy needs to actually implement themselves. For example someone building a third party mobile app could just add it as an extra feature.
But it is a problem even with Reddit.
At least for me many topics that I follow have several related subs and I often end up going through all of them individually to get a good overview and see different takes on news etc. With Reddit having the Other discussions tab helps a lot, but I guess that would be technically more difficult to implement in Lemmy.
IMHO both would benefit from having a way to combine different feeds under user defined categories. How things actually work under the hood wouldn’t need to be changed, it would just be an UI feature that effects how the communities are presented to the user.
Right, so the light is actually pushed up by these buoyant forces and I guess that then also explains why it’s so dark underground. Fascinating how learning some little new details about the world can sometimes make it all just click together!
But does that mean that light is actually hollow?