




  • TL;DR The new method still requires his art.

    LoRA is a way to add additional layers to a neural network that effectively allow you to fine-tune its behaviour. Think of it like a “plugin” or a “mod”.

    LoRAs require examples of the thing you are targeting. Lots of people in the SD community build them for particular celebrities or art styles by collecting examples of that celebrity or style from around the web.
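
    To make the “plugin” analogy a bit more concrete, here’s a toy PyTorch sketch of the underlying idea (illustrative only; real Stable Diffusion LoRAs patch the attention layers of the UNet and text encoder, and the class and parameter names here are made up):

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """Freeze an existing linear layer and learn a tiny low-rank 'patch' on top of it."""
        def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 1.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():
                p.requires_grad = False  # the original weights are never touched
            # the only trainable parameters: two small matrices whose product acts as a delta on the weight
            self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
            self.B = nn.Parameter(torch.zeros(base.out_features, rank))
            self.alpha = alpha

        def forward(self, x):
            # base output + low-rank correction (equivalent to adding alpha * B @ A to the weight matrix)
            return self.base(x) + self.alpha * (x @ self.A.T @ self.B.T)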

    So in this case Greg has asked Stable to remove his artwork, which they have done, but some third party has created an unofficial LoRA that does use his artwork to mod the functionality back in.

    In the traditional world the rights holder would presumably DMCA the plugin, but the lines are much blurrier with LoRA models.



    You can’t boycott the businesses that aren’t doing their part, given that most businesses aren’t doing their part and the ones that are produce stuff that’s more expensive and/or less convenient.

    Supply chains are also super complex these days, and even the companies themselves don’t always report on them properly, out of either incompetence or simple denial. That’s why every few years we get stories blowing up about tech firms using slave labour to build phones or food corporations ripping off third-world farmers.

    Working people are tired and worn down and poor, and don’t have the mental capacity or even the capital to micro-evaluate every single purchase decision they make and think “hmm, does this company or one of its hundreds of suppliers do their part for the climate?”

    For some people it’s “I can afford to feed my kids if I use this cheap product from a company that does bad things, or I can go without dinner this week if I only buy from ethical companies.”

    Strong top down regulation is the only practical way to make big companies behave.






    API calls are almost always private between the caller and the endpoint (think Telegram bots or mobile apps). There isn’t really a technically feasible way for a crawler to somehow “infer” any kind of knowledge of how API calls are being used unless the result has some kind of publicly visible side effect (e.g. the program using the API generates a web page and uploads it somewhere crawlable). Google et al. go by how many links from other pages to the page of interest exist (inbound links) and multiply by a smattering of other signals like quality of keywords, length of content, etc.
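
    (For what it’s worth, the “count inbound links” part is roughly the PageRank idea. Here’s a toy Python sketch of link-based scoring, purely illustrative and nothing like Google’s actual formula:)

    from collections import defaultdict

    def rank(pages, iterations=20, damping=0.85):
        """pages maps a URL to the list of URLs it links out to."""
        n = len(pages)
        scores = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_scores = defaultdict(lambda: (1 - damping) / n)
            for page, outlinks in pages.items():
                if not outlinks:
                    continue
                share = damping * scores[page] / len(outlinks)
                for target in outlinks:
                    new_scores[target] += share  # every inbound link passes on some score
            scores = {p: new_scores[p] for p in pages}
        return scores

    # pages with more (and better-scored) inbound links end up with higher scores
    print(rank({"a": ["b"], "b": ["a", "c"], "c": ["a"]}))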

    That said, if you’re implying that the API changes mean that:

    • people are less likely to use Reddit because they can’t access it via RIF/Apollo,
    • less useful content is added to the site to be indexed,
    • fewer inbound links are generated that point to existing posts, and
    • pages stagnate and drop in ranking,

    then that is a plausible concern.






  • Hey - I found the same thing WRT the docker files - the compose files from the official project are ever-so-subtly wrong.

    Tagging a docker network as internal blocks outside network comms AFAIK, so the default compose file essentially puts the lemmy server inside its own little sandbox and prevents it from communicating with other servers.

    The solution I found was to add lemmy to both the internal network and the external proxy network:

    
    ## this is what the networks part looks like by default
    networks:
      # communication to web and clients
      lemmyexternalproxy:
      # communication between lemmy services
      lemmyinternal:
        driver: bridge
        internal: true

    #... other stuff here
    #lemmy service inside your services: section
      lemmy:
        image: dessalines/lemmy:0.17.3
        hostname: lemmy
        networks:
          - lemmyinternal
          - lemmyexternalproxy # this is the important addition
        restart: always
        environment:
          - RUST_LOG="warn,lemmy_server=info,lemmy_api=info,lemmy_api_common=info,lemmy_api_crud=info,lemmy_apub=info,lemmy_db_schema=info,lemmy_db_views=info,lemmy_db_views_actor=info,lemmy_db_views_moderator=info,lemmy_routes=info,lemmy_utils=info,lemmy_websocket=info"
        volumes:
          - ./lemmy.hjson:/config/config.hjson
        depends_on:
          - postgres
          - pictrs
    
    

    Another thing I noticed was that the documentation binds nginx to port 80, but the provided docker-compose binds to port 8536, which is the default port that lemmy seems to listen on. I bound 8536 to my host machine and use Caddy as a reverse proxy (because it does Let's Encrypt for you, which is nice).
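
    For reference, the Caddy side is just a couple of lines (a minimal sketch, assuming your domain is example.com and port 8536 is published on the host):

    # Caddy obtains a Let's Encrypt certificate for example.com automatically
    example.com {
        reverse_proxy localhost:8536
    }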

    (Writing to you now from my self-hosted instance which I set up with the above notes)