Web Dev Person / Ex Performance ECU Calibrations Person

  • 4 Posts
  • 46 Comments
Joined 1 year ago
Cake day: July 3rd, 2023

  • That seems like a pretty naive and biased approach to software to me honestly.

    Ease of use, community support, feature set, CI/CD, etc. should all come into play when deciding what to use.

    Freedom at all costs is great until you shrink community development and the potential user base by 90% by hosting on a completely open repo service that only 5% of people use, or on some small Discord alternative.

    The alternative is to host on multiple platforms/communities, and then the management and time investment of keeping them all in sync and active goes up.

    As with most things in life, it’s best to look at things with nuance rather than a hard stance imo.

    I may stand it up on another service at some point, but also anyone else is totally free to do that as well. There are no restrictions.




  • Thanks!

    Unfortunately there currently isn’t a true RAG implementation, largely because this site/app is fully self-contained, with none of the additional servers or databases that RAG typically requires.

    For now file uploads are stored in the browser’s own local database and the content can be extracted and added to the current conversation context easily.

    I definitely want to add a fuller RAG system, but it’s a process to say the least, and if I implement it I want it to be quite effective. My experience with RAG has generally left me unimpressed, with a few quite decent implementations being the exception.
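    To make the “extract and add to the conversation context” step concrete, here’s a rough sketch of folding an uploaded file’s text into an OpenAI-style message list. The helper name `buildContextMessages` and the truncation limit are my own illustration, not this app’s actual code:

```javascript
// Hypothetical sketch: inject extracted file text into the chat context.
// `buildContextMessages` and `maxChars` are illustrative names/values.
function buildContextMessages(history, fileName, fileText, maxChars = 8000) {
  // Truncate large files so the prompt stays within the model's context window.
  const excerpt =
    fileText.length > maxChars
      ? fileText.slice(0, maxChars) + "\n…[truncated]"
      : fileText;
  return [
    ...history,
    {
      role: "user",
      content: `Contents of "${fileName}":\n\n${excerpt}`,
    },
  ];
}
```

    This is simple prompt stuffing rather than true RAG: there’s no embedding or retrieval step, the whole (truncated) file just rides along in the context.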




    This project is entirely web based, using Vue 3. It doesn’t use LangChain, and honestly I hadn’t looked into it before, but I do see they offer a JS library I could utilize. I’ll definitely be looking into that!

    As a result there is currently no LLM function calling, and from what I remember apps like LM Studio don’t support function calling when hosting models locally. Adding the ability to retrieve outside data, like searching the web and generating a response from the results, is definitely on my list.
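    For reference, OpenAI-style function calling works by attaching a `tools` array to the chat request. A hypothetical sketch of what such a request body looks like (the `web_search` tool name and its schema are purely illustrative, not something this project implements, and many local servers will simply ignore the field):

```javascript
// Hypothetical sketch: an OpenAI-style chat request with a "tools" array.
// The web_search tool here is illustrative only.
function buildToolCallRequest(model, messages) {
  return {
    model,
    messages,
    tools: [
      {
        type: "function",
        function: {
          name: "web_search", // illustrative tool name
          description: "Search the web and return result snippets.",
          parameters: {
            type: "object",
            properties: {
              query: { type: "string", description: "The search query" },
            },
            required: ["query"],
          },
        },
      },
    ],
  };
}
```

    If the backend supports it, the model responds with a tool call (name plus JSON arguments) instead of plain text, and the client runs the tool and sends the result back in a follow-up message.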



  • Local models are indeed already supported! In fact any API (local or otherwise) that uses the OpenAI response format (which is the standard) will work.

    So you can use something like LM Studio to host a model locally and connect to it via the local API it spins up.

    If you want to get crazy…fully local in-browser models are also supported, currently in Chrome and Edge. The app downloads the selected model in full, loads it via WebGPU in your browser, and lets you chat. It’s more experimental and takes real hardware power, since you’re hosting the model entirely in the browser itself.
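    For anyone wanting to try the LM Studio route, here’s a rough sketch of talking to an OpenAI-compatible local endpoint. The `http://localhost:1234` base URL is LM Studio’s usual default (check your own server settings), and the helper names are mine, not this project’s code:

```javascript
// Sketch: build and send a chat request to an OpenAI-compatible local server,
// e.g. LM Studio (default base URL is usually http://localhost:1234).
function buildChatRequest(baseUrl, model, messages) {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

async function chatLocal(baseUrl, model, messages) {
  const { url, options } = buildChatRequest(baseUrl, model, messages);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const data = await res.json();
  // OpenAI-format responses put the reply text here:
  return data.choices[0].message.content;
}

// Example usage (assumes a model is loaded and the local server is running):
// chatLocal("http://localhost:1234", "local-model", [
//   { role: "user", content: "Hello!" },
// ]).then(console.log);
```

    Because the request/response shape is the OpenAI format, the same code works against any compatible backend by swapping the base URL.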












  • Well you’ll probably really enjoy This Video haha!

    The MGU-H mode was on the “motor” so it doesn’t count for this challenge but still that’s the absolute most I can get out of the car!

    I’m right on the edge of my ability through a lot of this lap, proper sweaty attempt.

    I’ll run in the correct mode for an official lap if I end up needing to, no worries there. At most the advantage was a few tenths as ERS deployment per lap is limited.

    Moar people need to join up and give it a go!



  • Hah, funnily enough Alonso is my all time favorite driver and I have admired his driving style for many years.

    • Yeah Degner 1 is flat if you just close your eyes and huck it in!

    • I don’t think I’ve ever taken Spoon curve in any game or car and thought I did it right, it always feels so wrong to me. I think that attempt was the closest I’ve been to feeling I got it right.

    The 1:29’s are clearly within grasp for both of us, I feel I left a few tenths on the table at the hairpin alone with the lockup and horrible line.


  • Very nice lap! It definitely looks like you tend to be a driver with really smooth and clean inputs; I tend to wrestle things a bit more in comparison, at least in qualifying I’ve noticed. I’m always fascinated by drivers having such different styles while being relatively equal in speed.

    I just happened to have set a new lap myself, just to give you a bit of a headache 😆