Saturday, May 4 | All That Matters

Artificial Intelligence: Last Week Tonight with John Oliver (HBO)



Posted to Reddit by BoogsterSU2

17 Comments

  • turtlevenom

    He’s just so good at what he does. Thank you and your entire team, John, for always fighting the good fight, and making our sides split along the way.

  • PPQue6

    I’m so glad he brought up the point about AI being used in resume scanners. This has been one of the most underreported issues in the job market. You used to tailor your resume for each company so you could showcase who you are and what you can bring to the table. Send out a resume like that these days, though, and you won’t get any interviews. Now what you basically have to do is use a format that can be read by Applicant Tracking Systems (ATS, i.e. resume scanners) and load it with keywords just so you can make it through that first hurdle.

    Also since he touched on what the EU is doing with their regulations, it makes me hope that our lawmakers here are considering doing the same thing, though I very much have my doubts…
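    The keyword filtering described above can be sketched in a few lines of Python. This is a deliberately naive illustration, not any real ATS product; the keyword list, threshold, and scoring scheme are all hypothetical:

    ```python
    # Toy sketch of ATS-style keyword screening: score a resume by how
    # many required keywords it contains. Real systems also parse
    # sections, dates, and formatting, but the keyword gate works
    # roughly like this.
    def keyword_score(resume_text, keywords):
        """Return the fraction of required keywords found in the resume."""
        text = resume_text.lower()
        hits = [kw for kw in keywords if kw.lower() in text]
        return len(hits) / len(keywords)

    # Hypothetical keywords lifted from a job posting.
    REQUIRED = ["python", "sql", "agile", "stakeholder"]

    resume = "Experienced analyst skilled in Python and SQL reporting."
    score = keyword_score(resume, REQUIRED)
    print(score)  # 0.5 -- only two of the four keywords matched
    ```

    A tailored, well-written resume that phrases things differently scores low under a filter like this, which is exactly why keyword-stuffed formats now get through while others don’t.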

  • MissDiem

    Glad the overblown hype about this latest overestimated fad may finally be beginning to break.

    We all know Reddit has become a hostile hivemind. To see this in action, dare to post the fact that Chernobyl has become highly factionalized and watch what happens.

    Far as I can tell, the chat bots currently being billed as “AI” are essentially supercomputer versions of Reddit moderators, there to assimilate and hivemind opinion whether it’s true or not, mass-amplify that “one true” (but actually false) opinion, and then aggressively censor anyone who disagrees by burying them with an unreal amount of false sources.

  • sam__izdat

    While skepticism is good, the problem with liberal skepticism about machine learning is that it’s both too anemic and somewhat misdirected. It’s true that these systems are deficient in myriad ways. It’s also true that they are going to spew out all the social biases they were trained on. But…

    1. It’s a mistake to anthropomorphize and confuse features for bugs. It’s all A and no I. A large language model will predict the most plausible next word in a sentence. That’s literally all it does. It is a giant conveyor belt shitting out plausible — not *true* — output by aligning the biases of a huge pile of linear algebra to your own biases, in the non-technical sense of the word. That *is* the technology. That’s the whole point. And a lot of it is almost pure grift awash in capital, like self-driving cars. It’s not a technology in development. It’s a scam for the neoliberal wrecking balls to continue pilfering what could otherwise go toward public transit and public infrastructure to get cars off the roads.

    2. What should be far more terrifying than the question “what if the computer doesn’t do what they want it to?” is the question “what if it does?” With the power systems in place, when you weed out the marketing bullshit, the most practical applications of machine learning include surveillance capitalism, disinformation, manipulation, class control and digital Taylorism. You don’t live in a world where the people holding the purse strings will try to make your life better. You live in one where they’re going to use “AI” to optimize the piss breaks of the people packing your Amazon shipments.

    Also, that medical record, which was for some reason on a public-facing server and begging to be indexed by well-behaved crawlers, is not going to be memorized by a diffusion model plowing through five billion images. That’s not how that works.

    Also also, Getty Images is a [bunch of parasites](https://en.wikipedia.org/wiki/Getty_Images#Claiming_copyright_over_public_domain_content), and a generative model pumping out watermarks doesn’t mean that you’re looking at a reproduction of a copyrighted image. It doesn’t know the difference between a beach and a watermark. If you train it on a ton of images where landscapes appear behind watermarks, it’ll just learn that a watermark is a part of nature, like rocks and sand dunes, and will do its best to faithfully reproduce this prominent landmark where it seems most appropriate. It’s not copy-pasting a stock photo. It’s trying to produce a convincing picture, and it “thinks” copyright-troll watermarks are one of the ingredients.
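    The “predict the most plausible next word” objective described in point 1 can be sketched with a toy bigram model. Real LLMs are neural networks over subword tokens, not lookup tables, but the objective is the same: emit the most probable continuation seen in training, not the true one. The corpus here is made up for illustration:

    ```python
    from collections import Counter, defaultdict

    # Tiny bigram "language model": count which word follows which
    # in the training text, then always emit the most frequent successor.
    corpus = "the cat sat on the mat the cat ate the fish".split()

    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    def most_plausible_next(word):
        """Return the continuation seen most often after `word` in training."""
        return counts[word].most_common(1)[0][0]

    print(most_plausible_next("the"))  # "cat" -- most frequent, not necessarily true
    ```

    Whatever biases the training text carries come straight out the other end: the model has no notion of truth, only of frequency.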

  • __Rick_Sanchez__

    This is the first John Oliver episode that doesn’t reveal anything new about the topic for me, and I’m not even following AI that closely.

  • OsakaWilson

    The statement that “AI will not take your job, a person who works with AI will take your job” is only temporarily true. In the end, AI will create as many jobs for people as cars did for horses.*

    *Not my original line.
