Or am I the only one remembering this opinion? I felt like it was common for people to say that the internet couldn’t be taken down, or censored, or whatever. This has obviously been proven false by the Great Firewall of China and Russia’s recent attempts at completely disconnecting from the global internet. Where did this idea come from?

  • Eheran@lemmy.world · 6 days ago

    Probably because of the (originally) decentralized nature. But it is anything but decentralized now, pretty much like water infrastructure or roads. So many single points of failure. Sure, you could drive 1000 miles through 100 towns, and with only a few people doing that it will “only” take a lot longer. But route a major portion of the traffic through there and that will be the end of that.
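
    To put the road analogy in code: a toy Python sketch (made-up topology, nothing measured) of a tiny network where most routes funnel through one hub, and how the hop count changes once the hub is gone.

    ```python
    from collections import deque

    # Toy network: towns A..F plus a hub H. The fast routes go via H,
    # but there is also a long back-road chain A-B-C-D-E-F.
    links = {
        "A": {"H", "B"}, "B": {"A", "C"}, "C": {"B", "D"},
        "D": {"C", "E"}, "E": {"D", "F"}, "F": {"E", "H"},
        "H": {"A", "F"},
    }

    def hops(net, src, dst):
        """Shortest hop count from src to dst, or None if unreachable (BFS)."""
        seen, queue = {src}, deque([(src, 0)])
        while queue:
            node, dist = queue.popleft()
            if node == dst:
                return dist
            for nxt in net[node] - seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
        return None

    print(hops(links, "A", "F"))  # 2 hops via the hub H
    # "Take down" the hub: every route has to crawl the long chain instead.
    without_hub = {n: nbrs - {"H"} for n, nbrs in links.items() if n != "H"}
    print(hops(without_hub, "A", "F"))  # 5 hops the long way round
    ```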

    • Electricblush@lemmy.world · 6 days ago

      This is also due to the sheer size of web traffic these days.

      Originally (if we, say, take early HTML as a starting point) it was mostly text, then later a few images.

      These days a simple webpage needs large amounts of code and data just to load. So packets having to get to you in a roundabout way doesn’t just make the page take a little longer to load, it will most likely break the page.

      But the infrastructure and ways of communicating are really hard to take down, and except for the few nations that have complete control over their own network, it is nearly impossible to shut down communication completely. You would just have to fall back on simpler data formats.

      As others have stated, fewer ISPs and core infrastructure providers do make the global network a bit more vulnerable today. And the sites and services that lots of people consider “the internet” can be taken down/offline, at least for a while.