Senator Durbin Petulantly Promises To Destroy The Open Internet If He Doesn’t Get His Bad ‘Save The Children’ Internet Bill Passed

from the must-we-do-this-again? dept

Last week, we wrote about Senator Dick Durbin going on the floor of the Senate and spreading some absolute nonsense about Section 230 as he continued to push his STOP CSAM Act. His bill has some good ideas mixed in with some absolutely terrible ideas. In particular, the current language of the bill is a direct attack on encryption (though we’re told that there are other versions floating around). The method by which it does so is removing Section 230 protections, enabling people to sue websites if they “intentionally, knowingly, recklessly, or negligently” host CSAM or “promote or facilitate” child sexual exploitation.

Now, sure, go after sites that intentionally and knowingly host CSAM. That seems easy enough (and is already illegal under federal law and not blocked by Section 230). But the fear is that merely offering encryption could be seen as “facilitating” exploitation, and thus the offering of encrypted communications absolutely will be used by plaintiffs to file vexatious lawsuits against websites.

And rather than fixing the bill, Senator Durbin says he’ll push for a full repeal of Section 230 if Congress won’t pass his problematic bill (for what it’s worth, this is the same thing his colleague Lindsey Graham has been pushing for, and it looks like Graham has looped Durbin into this very dumb plan):

If Congress doesn’t approve kids’ online safety legislation, then it should repeal Communications Decency Act Section 230, Senate Judiciary Committee Chairman Dick Durbin, D-Ill., told us last week.

Ranking member Lindsey Graham, R-S.C., is seeking Durbin’s support for legislation… that would repeal Section 230, the tech industry’s shield against liability for hosting third-party content on platforms. Durbin told us he will see what happens with Judiciary-approved legislation on kids’ safety. “If we can’t make the changes with the bills we’ve already passed, 230 has to go,” Durbin said.

Durbin has already made it clear that he does not understand how Section 230 itself works. Last week, on the floor of the Senate, he ranted misleadingly about it while pushing for unanimous consent for STOP CSAM. He starts off with a tearjerker of a story about parents who lost children to terrible people online. But rather than blaming the terrible people, he seems to think that social media companies should wave a magic wand and magically stop bad people:

The emotion I witnessed during that hearing in the faces of survivors, parents, and family members were unforgettable. There were parents who lost their children to that little to the telephone that they were watching day in and day out.

They committed suicide at the instruction of some crazy person on the internet.

There were children there that had grown up into adults still haunted by the images that they shared with some stranger on that little telephone years and years ago.

So, first of all, as I’ve noted before, it is beyond cynical and beyond dangerous to blame someone’s death by suicide on any other person when no one knows for sure the real reason for taking that permanent, drastic step except the person who did it.

But, second, if someone is to blame, it is that “crazy person on the internet.” What Durbin leaves out is the most obvious question: was anything done to that “crazy person on the internet”?

And you think to yourself? Well, why didn’t they step up and say something? If those images are coming up on the Internet? Why don’t they do something about it? Why don’t they go to the social media site? And in many and most instances they did. And nothing happened and that’s a reason why we need this legislation.

So, a few things here: first off, his legislation is about stopping CSAM, yet he was talking about suicide. Those are… different things with different challenges? Second, the details absolutely matter here. If it is about CSAM, or even non-consensual intimate imagery (in most cases), every major platform already has a program to remove such content.

You can find the pages for Google, Meta, Microsoft and more to remove such content. And there are organizations like StopNCII that are very successful in removing such content as well.

If it’s actual CSAM, that’s already very illegal, and companies will remove it as soon as they find out about it. So Durbin’s claims don’t pass the sniff test, and suggest something else was going on in the situations he’s talking about, not evidence of the need for his legislation.

We say… STOP CSAM Act says, we’ll allow survivors to child online sexual exploitation to sue the tech companies that have knowingly and intentionally facilitated the exploitation.

Again, which platforms are not actually already doing that?

In other words one young woman told the story. She shared an image of herself an embarrassing image of herself that haunted her for decades afterwards. She went to the website. That was that was displaying this and told them this is something I want to take down. It is embarrassment to me. It happened when I was a little girl and still I’m living with it even today. They knew that it was on this website because this young woman and her family proved it, and yet they did nothing, nothing let him continue to play this exploitation over and over again.

Why how to get away with that they asked, and many people asked, I thought we had laws in this country protecting children what’s going on? Well, there’s a Section 230 which basically absolves these companies these media companies from responsibility for what is displayed on their websites on their social media pages. And that’s exactly what we change here.

Again, none of this makes any sense. If the imagery was actually CSAM, then that’s very much illegal and Section 230 has nothing to do with it. Durbin should then be asking why the DOJ isn’t taking action.

From the vague and non-specific description again, it sounds like this wasn’t actually CSAM, but rather simply “embarrassing” content. But “embarrassing” content is not against the law, and thus, this law still wouldn’t make any difference at all, because the content was legal.

So what situation does this law actually solve for? It’s not one involving Section 230 at all.

We say something basic and fundamental. If the social media site knowingly and intentionally continued to display these images, they’re subject to civil liability. They can be sued. Want to change this scene in a hurry? Turn the lawyers loose on them. Let them try to explain why they have no responsibility to that young woman who’s been exploited for decades. That’s what my bill works on. I’m happy to have co-sponsorship with Senator Graham and others. We believe that these bills this package of bill should come to the floor today.

Again, if it’s actually CSAM, then it’s a criminal issue and the responsibility is on law enforcement. Why isn’t Durbin asking why law enforcement did nothing? Furthermore, all the major companies report actual CSAM to NCMEC’s CyberTipline, and most, if not all, of them use some form of Microsoft’s PhotoDNA to identify reposts of the same content.

So, if it’s true that this young woman had exploitative imagery being passed around, as Durbin claims, it sounds like either (1) it wasn’t actually illegal, in which case this bill would do nothing, or (2) there was a real failing by law enforcement and/or by NCMEC and PhotoDNA. It’s not at all clear how “turning the lawyers loose” for civil lawsuits fixes anything about either of those problems.

Again, Durbin seems to wholly misunderstand Section 230, issues related to CSAM, and how modern internet companies work. It’s not even clear from his speech that he understands the various issues. He switches at times from talk of suicide to embarrassing imagery to CSAM, without noting the fairly big differences between them all.

And now he wants to get rid of Section 230 entirely? Why?

The Communications Daily story about Durbin’s plans also has some ridiculous commentary from other senators, including Richard Blumenthal, who never misses an opportunity to be the wrongest senator about the internet.

Passing kids’ online safety legislation is more realistic than obtaining a Section 230 repeal, Senate Privacy Subcommittee Chairman Richard Blumenthal, D-Conn., told us in response to Graham’s plans. Blumenthal introduced the Kids Online Safety Act with Sen. Marsha Blackburn, R-Tenn., …“Passing a repeal of Section 230, which I strongly favor, is far more problematic than passing the Kids Online Safety Act (KOSA), which has almost two-thirds of the Senate sponsoring,” said Blumenthal. “I will support repealing Section 230, but I think the more viable path to protecting children, as a first step, is to pass the Kids Online Safety Act.”

Of course Blumenthal hates 230 and wants it repealed. He’s never understood the internet. This goes all the way back to when he was Attorney General of Connecticut. He thought that he should be able to sue Craigslist for prostitution and blamed Section 230 for not letting him do so.

There are other dumb 230 quotes from others, including Chuck Grassley and Ben Ray Lujan (who is usually better than that), but the dumbest of all goes to Senator Marco Rubio:

Section 230 immunity hinges on the question of how much tech platforms are controlling editorial discretion, Senate Intelligence Committee ranking member Marco Rubio, R-Fla., told us. “Are these people forums or are they exercising editorial controls that would make them publishers?” he said. “I think there are very strong arguments that they’re exercising editorial control.”

I know that a bunch of very silly people are convinced this is how Section 230 works, but it’s the opposite of this. The entire point of Section 230 is that it protects websites from liability for their editorial decision making. That’s it. That’s why 230 was passed. There is no “exercising editorial control” loophole that makes Section 230 not apply because the entire point of the law was to enable websites to feel free to exercise editorial control to create communities they wanted to support.

Rubio should know this, but so should the reporter for Communications Daily, Karl Herchenroeder, who wrote the above paragraph as if it was accurate, rather than completely backwards. Section 230 does not “hinge” on “how much tech platforms are controlling editorial discretion.” It hinges on “is this an interactive computer service or a user of such a service” and “is the content created by someone else.” That’s it. That’s the analysis. Editorial discretion has fuck all to do with it. And we’ve known this for decades. Anyone saying otherwise is ignorant or lying.

In the year 2024, it is beyond ridiculous that so many senators do not understand Section 230 and just keep misrepresenting it, to the point of wishing to repeal it (and with it, the open internet).
