My take on copyright “education” via 6 strikes

I haven’t written much here about my dissertation research, but occasionally there’s a story that expresses well the point I’m attempting to make: law doesn’t work well when it’s misaligned with social norms.

Internet Service Providers (ISPs) have just revealed their new strategy for working with the content industries to crack down on internet piracy (How ISPs will do “six strikes”: Throttled speeds, blocked sites | Ars Technica). The approach has three phases: notice, acknowledgement, and punishment. A spokesperson said that the goal was not to stop heavy-downloading “serial pirates.”

[The ISPs’ goal is to] educate “the vast majority of the people for whom trading in copyrighted material has become a social norm.”

The three phases certainly have a better chance of success than the bland, generic copyright “education” of the past, but they still might fall short of achieving the kind of change content owners are looking for.

My research is starting to reveal that norms usually trump law, despite “education” efforts. Media effects and psychological theories point to the difficulty of overcoming factors like our natural tendency to think heuristically and the power of group norms. I’m sure these measures will bring a small decrease in downloading, but I predict a push-back as our culture engages in more online creativity, especially when that creativity is an individual’s livelihood!

I’d argue that there are also issues of private enforcement of law here, but that’s a topic for another day.

What Scoble is missing on context – we want control, not censorship

A feud is brewing between George Takei (of Star Trek, and more recently Facebook, fame) and tech guru Robert Scoble. At the heart of the disagreement is an issue that could become the free speech issue of our day: control over one’s social networking feed.

Some background

Takei has developed a large following on his Facebook page. Recently, Facebook instituted changes whereby page owners must pay a fee to ensure that more of their followers see their posts. Takei’s forthcoming book devotes an entire chapter to criticizing the system; he writes, “I am curious as to why interactivity rates on my page appear to fluctuate so much when I have done nothing different.” I agree with Takei that it seems disingenuous for Facebook to start filtering posts (much less charge for higher exposure) after one develops a following on its network.

Scoble is working on a book about “contextual computing,” a topic I’ll confess to being excited about. Context awareness allows computers to respond to their surroundings. For example, I’ve configured my smartphone to turn the volume up when I get home and down at night, and my computer to change the default printer based on whether I’m at work or at home. It won’t be long before our computers recognize where we are and what we are doing, and organize our content automatically (goodbye, folders and tagging!).
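For the curious, rules like these are trivial to express in code. Here’s a minimal sketch in Python, assuming a stubbed-out location check and made-up printer names (a real automation app would pull these signals from the device’s location services and clock):

```python
from datetime import datetime

# Hypothetical context signal: a real automation app would read this from the
# device's location services rather than returning a hard-coded value.
def current_location() -> str:
    return "home"

def choose_ring_volume(location: str, now: datetime) -> int:
    """Pick a ring volume (0-100) from simple context rules."""
    if now.hour >= 22 or now.hour < 7:  # night: quiet no matter where I am
        return 10
    if location == "home":              # home during the day: turn it up
        return 80
    return 40                           # sensible default anywhere else

def choose_default_printer(location: str) -> str:
    """Map the current location to a default printer (printer names are made up)."""
    return {"work": "office-laserjet", "home": "home-inkjet"}.get(location, "save-as-pdf")

now = datetime.now()
location = current_location()
print(f"volume={choose_ring_volume(location, now)}, printer={choose_default_printer(location)}")
```

The point is that once the machine knows the context, it makes the routine choice for me; the content itself is untouched.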

The disagreement

Scoble’s response to Takei’s criticism of Facebook was to call for a “war on noise,” and to ask George Takei to “sit down and shut up” (whoa!).

With constant feeds of content from Twitter, Instagram, Facebook, and the like, we are in a deluge of information (“noise”) that we can’t keep up with. Scoble sees the answer in algorithms like Facebook’s EdgeRank system, which make statistically based guesses about what individual users will be interested in. Just when we have gotten used to personal mass communication, this type of “contextual computing” is making editorial decisions about how to disseminate our speech.
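EdgeRank’s exact formula is proprietary, but it has been publicly described as combining the viewer’s affinity for the poster, a weight for the type of interaction, and a time decay. A rough, purely illustrative sketch of that style of scoring (all numbers and names here are invented):

```python
import math
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author_affinity: float  # how often the viewer interacts with this author (0-1)
    edge_weight: float      # e.g., comments and photos count more than bare likes
    created: datetime

def edgerank_style_score(post: Post, now: datetime, half_life_hours: float = 6.0) -> float:
    """Affinity x interaction weight x exponential time decay."""
    age_hours = (now - post.created).total_seconds() / 3600
    decay = math.exp(-math.log(2) * age_hours / half_life_hours)
    return post.author_affinity * post.edge_weight * decay

now = datetime.now()
feed = [
    Post(author_affinity=0.9, edge_weight=1.0, created=now - timedelta(hours=1)),   # close friend, fresh post
    Post(author_affinity=0.2, edge_weight=2.0, created=now - timedelta(hours=12)),  # page rarely engaged with
]
for post in sorted(feed, key=lambda p: edgerank_style_score(p, now), reverse=True):
    print(round(edgerank_style_score(post, now), 3))
```

Notice that nothing in a score like this knows what is actually going on around the reader at that moment; the guessing is purely statistical, which is the distinction I draw below.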

The “daily me”

Some years ago, Prof. Cass Sunstein wrote of the “daily me” (a term coined by Nicholas Negroponte) in his book Republic.com. He describes it as “a communications package that is personally designed, with each component fully chosen in advance.” The entirety of the book is dedicated to predicting the social ills that might arise from the “daily me,” such as siloing ourselves into separate interest groups that don’t know how to talk to each other. Sunstein finds that the random surprises in uncurated media offer opportunities to connect with ideas that one might otherwise miss.

The rub

I think Scoble is off the mark calling prolific communicators “noise,” and even farther off by calling systems like EdgeRank “context.” To me, “context” is a mechanical response to the moment. It allows machines to take into account what’s going on in the world around us. The “daily me” is curation, not context. Mechanically editing a content stream is not a response to the world around the individual — at best, it’s a misguided attempt to statistically guess interest, and at worst it is blatant censorship.

I have spoken with small business owners and educators who share Takei’s frustration. Most of the people I know find EdgeRank’s choices to be off the mark, and wonder what has happened to all of the friends Facebook is hiding. As individual mass communicators, we have rightfully developed an expectation that our voices will be heard.

We want our contextual computing to help organize and suggest content, making life more convenient. We don’t need algorithms to censor what we have to say. The market, and the marketplace of ideas, already does a fine job of that.

Journalist arrested while videotaping police is not guilty

Great story about a journalist taking on intimidation by police: Jury says journalist arrested while videotaping police is not guilty | Ars Technica.

In a Thursday interview, Miller told us that the prosecution accused him of “being antagonistic to police because I was questioning their orders.” However, he said, “that’s what I do. I know my rights. I know the law.”

That little bit of legal knowledge (who knows where he picked it up) deepened his coverage and brought to light a case where the police overstepped their bounds. One could learn the law on one’s own, but I’d argue that this case is one excellent argument for journalism schools.

Misunderstanding the law – the story of illegal MOOCs in Minnesota

Yesterday a big story broke in the arena of state higher ed law: Minnesota bans Coursera: State takes bold stand against free education. A letter was reportedly sent to MOOC provider Coursera, informing them of a Minnesota statute that prohibits institutions from offering education to Minnesota residents without registering with the state:

State law prohibits degree-granting institutions from offering instruction in Minnesota without obtaining permission from the office and paying a registration fee.

Naturally, the idea of a state prohibiting free, online education angered the internets and the story went viral (at least among my education technology peers).

But the story wasn’t over yet. Officials in charge of enforcing the statute have clarified their position, essentially stating that out-of-state MOOCs are beyond their purview (see Slate’s correction: Minnesota Coursera ban: State won’t crack down on free online courses after all).

“I don’t care what they do; we don’t regulate them,” George Roedler, the manager of institutional registration and licensing at the Minnesota Office of Higher Education, told Ars on Thursday.

For me, the interesting part is how the story of the law keeps changing. Research shows that once misinformation has spread, it is very difficult to correct the resulting misperceptions (a short description of the phenomenon). When the error concerns the law, it may be more likely to affect behavior, either pushing people toward compliance or driving them to rebel against it. Unless readers follow a story closely, they are likely to miss any corrections.

This is why it’s important for journalists to be careful about reporting that “the law says ___.” The law is open to interpretation and change. Some care is called for in using language like “crack down,” especially given how easy it is to inflame social media users into quickly spreading a rumor.