The Download: Listening robots, and Google’s AI emissions


We all want to be able to speak our minds online—to be heard by our friends and talk (back) to our opponents. At the same time, we don’t want to be exposed to unpleasant speech.

Technology companies address this conundrum by setting standards for free speech, a practice protected under federal law, and hiring in-house moderators to examine individual pieces of content and remove posts that violate predefined rules.

The approach clearly has problems: harassment, misinformation about topics like public health, and false descriptions of legitimate elections run rampant. But even if content moderation were implemented perfectly, it would still miss a whole host of issues. We need a new strategy: treat social media companies as potential polluters of the social fabric, and directly measure and mitigate the effects their choices have on us. Read the full story.

—Nathaniel Lubin & Thomas Krendl Gilbert

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or tweet ’em at me.)

+ Salads don’t have to be boring—especially when boiled eggs and salmon are involved.
+ Whac-a-mole has a surprisingly long and checkered past.
+ Here’s how to rehydrate quickly if you’re feeling the heat.
+ Phones are crazy expensive. But you can make them last longer if you’re smart about it.


