I’ll start:

  • RSS for blogs and news vs. social media
  • XMPP vs. WhatsApp/FB Messenger/Snapchat
  • IRC vs. Matrix, Teams, Discord, etc.
  • Forums vs. social media, Reddit, Lemmy(?)
  • Hexorg@beehaw.org

    I can’t quite find the blog post, but I saw someone write up using AWS’s MapReduce across multiple servers to process a dataset… and then they redid the pipeline with bash, awk, and maybe grep, and a single 8-core machine did it 100 times or so faster (roughly the kind of pipeline sketched at the end of this thread).

    • Omega@beehaw.org

      I think this is more of a problem of knowing when a specific tool should be used. Most people familiar with Hadoop are probably aware of all the overhead it creates. At the same time, you hit a point in dataset sizes (and I’d guess even more so with “real-time” data processing) where a single machine just isn’t feasible. (That said, I’m not too knowledgeable about Hadoop and big data, so anyone else feel free to chime in.)

      • Hexorg@beehaw.org

        Some context, though: that article was written when cloud computing was all the buzz, the way crypto just was and AI is now. Many people used the cloud just for the buzz, without understanding the tool (same with crypto and AI now).

    • flatbield@beehaw.org

      I think you can put this under the Linux command line, i.e. the bash shell and the commonly installed Linux command set. Way powerful for certain things (see the sketches below).
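
Not the actual pipeline from that blog post, but a minimal sketch of the general idea: a per-file “map” step in awk fanned out across 8 cores with xargs -P, and a final “reduce” step that sums the partial counts. The ./logs directory, the *.log layout, and the field numbers are made up for illustration.

    # Sketch only: assumes whitespace-separated web logs where field 9 is the status code.
    # Count requests per HTTP status code across many log files on one 8-core box.
    find ./logs -name '*.log' -print0 \
      | xargs -0 -n 32 -P 8 awk '{ part[$9]++ } END { for (s in part) print s, part[s] }' \
      | awk '{ total[$1] += $2 } END { for (s in total) print s, total[s] }' \
      | sort -k2,2nr
    # Each parallel awk prints partial "status count" lines (the map step);
    # the second awk merges them (the reduce step). The partial summaries are
    # tiny, so interleaved output from the parallel jobs isn't a practical issue here.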
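
In the same spirit, a couple of one-liners built only from commonly installed tools. The file name and field positions assume a web-server access log in the usual combined format, purely for illustration.

    # Top 10 client IPs (first whitespace-separated field) in a hypothetical access.log.
    cut -d ' ' -f 1 access.log | sort | uniq -c | sort -rn | head -n 10

    # Total bytes transferred, assuming the response size is field 10.
    awk '{ bytes += $10 } END { print bytes }' access.log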