Hey guys, I am a Next.js dev and I want to make an anime streaming website. Of course, I am totally broke, so I can't afford a database. Even if I could, how does that work anyway? I mean, do I have to upload every single anime?? That sounds like a lot of work. How do sites like Zoro or 9anime deal with it? Do they have their own DB? I thought about web scraping, but I am not sure how it's done; if anybody could explain it in detail I would be grateful. I am trying to understand how that stuff really works. Also, I know of an API called Consumet, but I want to be able to make my own! I am working on a French anime website, so I need to figure out how I can get the anime.

My previous website, built with the Consumet API (it still works, check it out :)): https://poketv.vercel.app/
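For reference, the kind of call my Consumet-powered site makes looks roughly like this from a Next.js route handler. The base URL and the gogoanime provider path are just assumptions for the sketch; swap in whatever Consumet instance and provider you actually run:

```ts
// app/api/search/route.ts — Next.js App Router handler proxying a Consumet-style API.
// CONSUMET_URL and the /anime/gogoanime/<query> path are assumptions for this sketch.
import { NextResponse } from "next/server";

const CONSUMET_BASE = process.env.CONSUMET_URL ?? "http://localhost:3001";

export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);
  const query = searchParams.get("q") ?? "";

  // Consumet-style search route: /anime/<provider>/<query>
  const res = await fetch(
    `${CONSUMET_BASE}/anime/gogoanime/${encodeURIComponent(query)}`
  );
  if (!res.ok) {
    return NextResponse.json({ error: "upstream error" }, { status: 502 });
  }

  const data = await res.json();
  return NextResponse.json(data);
}
```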

  • Nemila@lemmy.dbzer0.com (OP) · 2 years ago

    Yeah, you are right! I honestly just want to understand how it works. I need to know that I have the skills to make it, and I am looking to build those skills. What you said made me question my skills as a full-stack dev tho 😂 I am gonna have to learn more about backend stuff.

    Now, taking into account that I am not trying to make money out of it or build a website that can replace Zoro or 9anime, suppose I am just trying to build that website for me and a few friends. You mentioned scraping: should I use a language like Python, or Puppeteer (Node.js)? Or is there a software tool that can do it?

    • GiganticPrawn@beehaw.org · 2 years ago

      You should look into Usenet + Indexers + Sonarr + Plex.

      There are quite a few different guides on the subject. Basically, you would use Usenet as the backend source. Sonarr would be the application to grab all the files and automate downloading. Plex would be used for sharing. It’s a whole rabbit hole.
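      If it helps, the Sonarr + Plex half of that stack is commonly wired together with Docker Compose. Here's a rough sketch using the usual linuxserver.io images; the paths, user IDs, and timezone are placeholders, and you'd still need a Usenet download client (e.g. SABnzbd) plus an indexer account on top of this:

      ```yaml
      # Rough sketch only: placeholder paths, IDs, and timezone.
      services:
        sonarr:
          image: lscr.io/linuxserver/sonarr:latest
          environment:
            - PUID=1000
            - PGID=1000
            - TZ=Europe/Paris
          volumes:
            - ./config/sonarr:/config
            - ./media/anime:/tv        # library Sonarr manages
            - ./downloads:/downloads   # where your download client drops files
          ports:
            - "8989:8989"              # Sonarr web UI
          restart: unless-stopped

        plex:
          image: lscr.io/linuxserver/plex:latest
          environment:
            - PUID=1000
            - PGID=1000
            - VERSION=docker
          volumes:
            - ./config/plex:/config
            - ./media/anime:/tv        # same library, served to your friends
          ports:
            - "32400:32400"            # Plex web UI / clients
          restart: unless-stopped
      ```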

    • GerminatingSeed@lemmy.dbzer0.com · 2 years ago

      Language is mostly a concern for newbies. Experienced developers will often say it doesn't matter much, so just use whatever suits you.
      There are all sorts of tools and libraries out there that can help with scraping, or you may decide to roll your own. Every developer has their own preferences, so find what works for you.

      A big part of this would depend on what you’re aiming to get out of the experience. If it’s just for personal education, you might want to take an approach that’s more conducive to learning what you want to know. On the other hand, if you’re more interested in completing a project, you might focus less on learning and more on getting the project done.

      If you’re aiming for a private site, it removes a lot of concerns you need to have with a “real” (public) site, simplifying things greatly.