  • That rate of exploitation is pretty wild though: $2/hr for work that earns the employer hundreds. Most capitalists begin uncontrollably salivating just thinking about that.

    This is a power thing, though. The closest analogue we have (or had) in terms of rate of exploitation was Silicon Valley software engineers. They got basically everything for free to distract them from how much they were being exploited. If working conditions had been worse, they would have demanded higher pay or quit, because they could afford to.

    As the article notes, that is not the power dynamic in the Philippines at all. These are already among the highest-paying jobs there, and I doubt these workers are in a position to bargain for better. Too many people are willing to take their jobs, either in their own country or in other impoverished countries.

  • But it can be sold as “good enough” to credulous management, thereby still doing damage by getting people laid off in the short term.

    There’s a famous quote about investing: “the market can remain irrational longer than you can remain solvent.” I think that holds equally for the labor market. Just because you and everyone around you know your job can’t be replaced by AI doesn’t mean there won’t be an attempt to replace you that lasts long enough for you to lose your house.


  • I think this is mostly a symptom of gerontocracy. Most elected officials did not grow up with computers, which already makes them likely to be incurious about them. Couple that with being in office so long that they’ve developed a very high opinion of themselves and a conviction that they know best. I would guess a significant minority is actively hostile to learning anything about computers: you can hire any professional to explain things in baby talk, and it still won’t work on them. Combine that with the rest of the technologically illiterate politicians simply being indifferent, and you get this kind of policy.

  • I agree that it’s editorialized compared to the very neutral way the survey puts it. That said, I think you also have to take into account how AI has been marketed by the industry.

    They have been claiming AGI is right around the corner pretty much since ChatGPT first came to market. Sometimes it’s implied (e.g. “you’ll be able to replace workers with this”), and sometimes they stay vague on the timeline (e.g. OpenAI saying they believe their research will eventually lead to AGI).

    With that context, I think it’s fair to editorialize this as a dead end: even with billions of dollars being poured in, they won’t be able to deliver AGI on the timeline they’re promising.