• 3 Posts
  • 530 Comments
Joined 4 years ago
Cake day: January 17th, 2022

  • A lot of already great advice here, often clarifying that a computer that is not yours… is not yours.

    What I would still add, though, is that you are NOT, and I’m very confident in saying this, the only one in your very school asking that question. In fact, I would argue MOST users have the exact same concerns, but they might not even be aware that alternatives exist.

    So… do not push back, or even just opt out, all alone. Find others who have similar problems and solve them together.

    There might be a Linux User Group already; join them. If there isn’t one, consider starting it. It might be just you for a few weeks, even months, but at least you will dedicate time and space to improving YOUR situation. Chances are, though, that others, even if only curious at first, will check what you are up to, whether they can replicate it, etc.

    Don’t feel isolated; move the needle for yourself first, in your corner, but be welcoming to others who are eager to contribute.

    It’s a challenge, but it’s a fun one to tackle with others.





  • IMHO the question depends on:

    • who you are (boring, rando, political dissident, journalist, etc)
    • who you talk to (family, friends, work, etc)
    • what alternatives actually exist

    So… sure, Signal is not perfect, but if you can’t convince your family members to move to DeltaChat, it sure beats using WhatsApp, Telegram, etc.






  • utopiah@lemmy.ml to Privacy@lemmy.ml · Claude: papers please?

    IMHO LLM usage isn’t coherent with independence. That being said, I wrote quite a bit on self-hosting LLMs. There are quite a few tools available, like ollama, itself relying on llama.cpp, which can both run models locally and provide an API-compatible replacement for cloud services. As you suggested though, at home one typically doesn’t have the hardware, GPUs with 100+GB of VRAM, to run the state of the art. There is a middle ground, though, between the full cloud (API key, closed source) and open source at home on low-end hardware: running SOTA open models on cloud hardware. It can be done on any cloud, but it’s much easier to start with dedicated hardware and tooling; HuggingFace is great for that, but there are multiple options.

    TL;DR: closed cloud -> open models on cloud -> self-hosted provides a better path to independence, including training.
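
    To make “API-compatible replacement” concrete, here is a minimal sketch: ollama exposes an OpenAI-style `/v1/chat/completions` endpoint on its default port 11434, so the same client code can target a local model or a cloud service just by swapping the base URL. The model name `llama3` and the prompt are illustrative assumptions, not anything prescribed above.

    ```python
    import json
    from urllib import request

    # ollama's OpenAI-compatible endpoint (default local port 11434)
    LOCAL_BASE = "http://localhost:11434/v1"
    # the same client would point here when using the closed cloud instead
    CLOUD_BASE = "https://api.openai.com/v1"

    def build_chat_request(prompt, base_url=LOCAL_BASE, model="llama3"):
        """Build an OpenAI-style chat completion request.

        Only base_url decides whether the request goes to a self-hosted
        model or a cloud service; the payload shape is identical.
        """
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }
        return request.Request(
            f"{base_url}/chat/completions",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )

    # Sending it is one line once a local server is running:
    #   with request.urlopen(build_chat_request("Why self-host?")) as resp:
    #       print(json.load(resp))
    req = build_chat_request("Why self-host?")
    ```

    The point of the sketch: migrating from the closed cloud to self-hosting is mostly a configuration change, not a rewrite, which is what makes the closed-cloud -> cloud-hosted-open-models -> self-hosted path incremental.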









  • Prof Christian Brand, the emeritus professor in transport at Kellogg College,
    

    This guy doesn’t even have a degree.

    I really hope you are a troll lobbying for the car industry… because it is really weird for a rando to question the credentials of an Oxford professor when a single DuckDuckGo query turns up, on the page of one of the most prestigious universities in the world, for centuries: “Professor Christian Brand is an interdisciplinary environmental scientist, physicist and geographer with over 25 years research experience in academic and consultancy environments.”

    That doesn’t mean appeal to authority settles the matter, or that whatever Prof Christian Brand writes is correct. He isn’t right just because he’s a professor researching in the paper’s area of expertise… but his credentials are definitely on point.