• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: June 11th, 2023



  • Yup, and I'm getting sick of hearing this even on Arch Linux. Like, mofo, you could literally run a snapshot or backup before upgrading; don't blame us if you're yoloing your goddamn computer. Windows has exactly the same problem, and this is why we have backups. Christ.

    On my Arch Linux install, I literally have a pacman hook that forcibly runs a backup and verifies said backup before doing any system-wide update.
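
    In case anyone wants to copy the idea: a pacman hook is just a small INI-style file dropped into /etc/pacman.d/hooks/. The sketch below shows the general shape only; the hook file name and the backup script path are placeholders, not my exact setup.

    # /etc/pacman.d/hooks/00-pre-upgrade-backup.hook  (placeholder name)
    [Trigger]
    Operation = Upgrade
    Type = Package
    Target = *

    [Action]
    Description = Running and verifying pre-upgrade backup...
    When = PreTransaction
    # Placeholder script that performs the backup and verifies it;
    # AbortOnFail cancels the whole upgrade if that script exits non-zero.
    Exec = /usr/local/bin/pre-upgrade-backup.sh
    AbortOnFail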






  • Yep, and if open source licensing could be revoked on a whim, you can imagine the chaos that would ensue. That would be my understanding as well: an old version under the MPL is perfectly fine to fork off, while a newer version might not be, since it is under a different license. One of the reasons I like the Apache License is that it makes explicitly clear that the grant is irrevocable, whereas the MPL operates on the assumption that it is not revocable. The most fundamental problem with the legal system in the USA is that no law is “set in stone”, and anything left to assumption is open to reinterpretation by a judge who may side against you. (Hell, Google v. Oracle on copyrighted APIs is still handled on a case-by-case basis, so take it as you will.)

    Disclaimer: I am not a lawyer. I'm just sharing what I learned from the LegalEagle YouTube channel and a few other sources.


  • I concur. There are a few problems that can come up on various platforms, like Windows not implementing C11 standard threads; in that case you would use the TinyCThread library instead, which works like a polyfill (small sketch at the end of this comment).

    All of these problems and challenges are workable. If the problem with Debian is out-of-date libraries, you could set up CI/CD for release builds that rebuilds your software whenever an update occurs and statically links the updated dependencies.

    Back to your point: if they didn't design their code and architecture to be multi-platform, like you can in C, they need to re-evaluate their design decisions.
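
    To make the threads point concrete, a minimal sketch of that kind of fallback could look like the following. It assumes TinyCThread's header is on the include path and leans on __STDC_NO_THREADS__, the standard feature-test macro, to pick the real <threads.h> when it exists.

    /* Sketch only: use C11 <threads.h> where available, otherwise fall back
     * to TinyCThread, which exposes the same thrd_* API as a polyfill. */
    #if defined(__STDC_NO_THREADS__)
    #include "tinycthread.h"
    #else
    #include <threads.h>
    #endif
    #include <stdio.h>

    static int worker(void *arg)
    {
        printf("hello from worker %d\n", *(int *)arg);
        return 0;
    }

    int main(void)
    {
        thrd_t t;
        int id = 1;
        if (thrd_create(&t, worker, &id) != thrd_success)
            return 1;
        thrd_join(t, NULL);
        return 0;
    }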






  • It's rooted in my frustration from when I was trying to fill in missing implementation details on projects like Skia (at the time it lacked support for Vulkan). My fundamental core belief is that core libraries, say Skia, neural-net frameworks, and other crucial projects like that, should offer a C API that allows every type and implementation to be extended by any other language that can interface with a C API, by providing your own vtable or whatnot.

    One of the approaches I use for my GUI toolkit written in C (specifically on Linux, to replace Qt and GTK) is single-inheritance object-oriented programming in C: if you insert the base class structure at the top of your custom struct type and provide your own vtable for those objects, you can readily extend the underlying library natively in whatever programming language you use, assuming it can talk to a C API in a complete sense.

    Let me know if you want a demonstration of this; I would be happy to find the time to set up a small sample to give you an idea of how it's done. There is a rough sketch of the pattern at the end of this comment.

    And I am also aware of the criticisms of this approach: the verbosity of attempting object-oriented programming in C is kind of absurd, and the API surface balloons. That is largely why I work on a compiler-generator framework specifically to address those challenges, by letting me add dialects on top of the C language such as generics and object-oriented programming. It brings C closer to C# in terms of syntax and features, yet at the end of compilation it still produces readable C code as output. It also generates what I call an FFI-JSON: essentially a JSON file that describes all of the types used in a C project, the sizes of integer/floating-point types, struct types with their fields/offsets/sizes/comments, and function declarations. It is written so that you can read the JSON file and generate a binding library for your programming language, saving you weeks of work.
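
    For the single-inheritance trick described above, a rough sketch of the pattern looks like this. The type names are hypothetical and this is not the actual toolkit code, just the shape of the idea: embed the base struct as the first member of the derived struct, and keep behaviour in a vtable of function pointers that any language speaking the C ABI can supply on its own.

    #include <stdio.h>

    typedef struct Widget Widget;

    typedef struct WidgetVTable {
        void (*draw)(Widget *self);
    } WidgetVTable;

    struct Widget {
        const WidgetVTable *vtable;
        int x, y;
    };

    /* "Derived" type: Widget must be the first member so a Button * can be
     * treated as a Widget * safely. */
    typedef struct Button {
        Widget base;
        const char *label;
    } Button;

    static void button_draw(Widget *self)
    {
        Button *btn = (Button *)self; /* valid because base is the first member */
        printf("button '%s' at (%d, %d)\n", btn->label, self->x, self->y);
    }

    static const WidgetVTable BUTTON_VTABLE = { button_draw };

    int main(void)
    {
        Button btn = { { &BUTTON_VTABLE, 10, 20 }, "OK" };
        Widget *w = (Widget *)&btn;   /* upcast */
        w->vtable->draw(w);           /* dynamic dispatch through the vtable */
        return 0;
    }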




  • I will most likely be using C11 for threads.h and stdatomic.h for the foreseeable future (a small example is at the end of this comment). The problem with using the latest and greatest standard is the risk of compilers not supporting it, so I would likely wait at least 5 years before switching to C23, sometime in 2028 or 2029. There was a bit of controversy around the optional bounds checking in C11 that ended with it being removed/deprecated, and I am sure C23 will have something similar going on.

    I don’t plan on using #embed or constexpr, in favor of maintaining common C programming practices; language familiarity is still an important factor in a thriving project, as much as people nag me to rewrite everything in Rust or C++.
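
    To illustrate why C11 is enough for me here, a contrived, self-contained example using only <threads.h> and <stdatomic.h>, nothing past C11 and no compiler extensions:

    #include <stdatomic.h>
    #include <stdio.h>
    #include <threads.h>

    #define N_THREADS 4

    static atomic_int counter = 0;

    static int worker(void *arg)
    {
        (void)arg;
        for (int i = 0; i < 100000; i++)
            atomic_fetch_add_explicit(&counter, 1, memory_order_relaxed);
        return 0;
    }

    int main(void)
    {
        thrd_t threads[N_THREADS];
        for (int i = 0; i < N_THREADS; i++)
            thrd_create(&threads[i], worker, NULL);
        for (int i = 0; i < N_THREADS; i++)
            thrd_join(threads[i], NULL);
        printf("counter = %d\n", atomic_load(&counter)); /* always 400000 */
        return 0;
    }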


  • Probably this script:

    #!/bin/bash
    
    if [ -z "$1" ]
    then
            echo "Please provide git repository url as an argument for this script."
            exit 1
    fi
    # Accept only http(s)/ftp/file URLs.
    regex='(https?|ftp|file)://[-[:alnum:]\+&@#/%?=~_|!:,.;]*[-[:alnum:]\+&@#/%=~_|]'
    if [[ $1 =~ $regex ]]
    then
            # Repository name: last path segment with the .git suffix stripped.
            basename=$(basename "$1")
            reponame=${basename%.*}
            # Ask Gitea to create a pull mirror of the repository.
            curl -X 'POST' 'https://localgitea.com/api/v1/repos/migrate?access_token={Access Token Here}' \
              --insecure \
              -H "accept: application/json" \
              -H "Content-Type: application/json" \
              -d '{  "clone_addr": "'"$1"'",  "issues": false,  "labels": false,  "lfs": false,  "mirror": true,  "mirror_interval": "96h0m0s",  "private": false, "repo_name": "'"$reponame"'", "pull_requests": true,  "releases": true, "repo_owner": "githubpublic",  "service": "git",  "wiki": true}'
    else
            echo "Invalid URL"
            exit 1
    fi
    

    You can adjust it as needed. As for why I have the --insecure flag: there is a direct network cable between my PC and the server, so encryption/HTTPS is not needed here. This is probably my favorite command, because I save the above as .sra.sh in my home directory and then alias it in .bashrc to make an sra command by adding alias sra=/home/{your user account}/.sra.sh; from there, any time I have an interesting repository that I want to archive, I simply run sra {git url} and that's it. It also sets the mirror interval manually to a 4-day interval rather than every 8 hours, which would needlessly spam the git server.

    This is something I rely on every day, both as a developer and as a system admin; I maintain a separate supply chain and prevent supply-chain attacks by generating my own package feeds/registry automatically from Gitea/Forgejo.

    Edited to add: I noticed this community is for PowerShell, so here is the PowerShell version of the above:

    param (
        [Parameter(Mandatory=$true)]
        [string]$gitRepoUrl
    )
    
    # Validate the URL with a .NET-compatible regex (POSIX classes such as [:alnum:] are not supported by -match).
    function Test-Url($url) {
        $regex = '(https?|ftp|file)://[-A-Za-z0-9+&@#/%?=~_|!:,.;]*[-A-Za-z0-9+&@#/%=~_|]'
        return $url -match $regex
    }
    
    if (-not (Test-Url $gitRepoUrl)) {
        Write-Error "Invalid URL"
        exit 1
    }
    
    # Repository name: last path segment with the .git suffix stripped.
    $basename = Split-Path $gitRepoUrl -Leaf
    $reponame = [System.IO.Path]::GetFileNameWithoutExtension($basename)
    
    $headers = @{
        'accept' = 'application/json'
        'Content-Type' = 'application/json'
    }
    
    $body = @{
        'clone_addr' = $gitRepoUrl
        'issues' = $false
        'labels' = $false
        'lfs' = $false
        'mirror' = $true
        'mirror_interval' = '96h0m0s'
        'private' = $false
        'repo_name' = $reponame
        'pull_requests' = $true
        'releases' = $true
        'repo_owner' = 'githubpublic'
        'service' = 'git'
        'wiki' = $true
    } | ConvertTo-Json
    
    Invoke-RestMethod -Uri 'https://localgitea.com/api/v1/repos/migrate?access_token={Access Token Here}' -Method POST -Headers $headers -Body $body -SkipCertificateCheck