What happens when a salaried YouTuber goes solo: the Daily Grace story

Grace Helbig

You might have heard of Daily Grace, or Grace Helbig. She’s a 28-year-old actress-comedian who uploads videos on YouTube Monday to Friday. DailyGrace has 2 million+ subscribers and 227 million video views, and Forbes listed Grace on their 30 Under 30 Hollywood & Entertainment list for 2014, alongside Rebel Wilson, Jennifer Lawrence, Kelly Osbourne and Anna Kendrick, saying “Helbig is one of the sharpest, funniest voices on YouTube”.

Daily Grace died on December 31 2013.

Not the person: DailyGrace just isn’t Grace’s channel anymore, and since the start of 2014 no new content has been uploaded. The videos being uploaded Monday to Friday on that channel are reruns (the first reruns on YouTube?) and presumably Grace isn’t receiving any of the ad revenue from them. Until recently, Grace had a contract with a company called My Damn Channel, who are going through an identity crisis and rebranding as Omnivision Entertainment. She made videos on the YouTube channel DailyGrace and they paid her a salary and maybe a commission based on YouTube views.

“Grace leaving Daily Grace is kinda like a Pokemon evolving. You’re sad because you liked how cute it looked before, but you’re also excited because it can shoot lasers out of its eyes now.” –killmeeko

After five years, Grace and My Damn Channel have chosen to part ways, which, as VideoInk says, is probably the hardest decision Grace has made in her career. My Damn Channel owns the content and intellectual property Grace created while in their employment, including the YouTube channel DailyGrace, its 2 million+ subscribers, themed days (Sexy Friday etc.), catchphrases (you’ve been hazed, new viewser alert…), and Facebook page–her Tumblr and Twitter are still hers, presumably because they aren’t under the Daily Grace brand.

How do you deal with suddenly not being able to use any of the intellectual property you came up with? Compare a 2013 ‘commenting on your comments’ video with a 2014 one:

“Here’s the lesson: Many corporations think that by owning YouTube channels, they’ll have something valuable. But the value is not in the channel or in the number of subscribers. On YouTube, despite the corporatization of everything, the value is in people.” –Tim Helbig

The brand that My Damn Channel is asserting ownership over is effectively a person. People subscribed to DailyGrace for Grace, and have been steadily unsubscribing because of the new content drought and My Damn Channel/Grace drama. Grace is continuing to upload videos daily on her used-to-be-second-but-is-now-main channel ItsGrace, something she wasn’t allowed to talk about while she was still in charge of the DailyGrace accounts. Viewers were left with a cryptic goodbye on December 27 where Grace said she would be back making videos from January 6 after a break. She couldn’t say that these new videos wouldn’t be on the DailyGrace channel.

Is it fair enough that My Damn Channel is enforcing their rights under a mutually agreed contract that Grace either received legal advice on or had the opportunity to seek legal advice on? Probably. An arrangement that guaranteed an income for making YouTube videos would have looked pretty great five years ago, but as time goes on you’d start to realise that perhaps you could be earning more without the middleman taking a cut… and for doing what, exactly? My Damn Channel is a business and they’ll want to get all the ad revenue they can from the old DailyGrace videos they’re rerunning on YouTube. Grace is going independent, at least for the time being, and will have full ownership of the content she creates from now on. And at least 1.7 million subscribers have found their way to ItsGrace.

The sad things are that some fans might never find Grace’s new channel (My Damn Channel hasn’t changed the about page for DailyGrace from “I vlog everyday! Five days a week!”, except for removing her social media links and stripping the themed days from the header image), that Grace was faced with rebuilding her subscriber base from the 100,000 she had on her second channel, and that the day has come where My Damn Channel is exercising the control they have over a whole vault of content Grace made in an intimate setting–inside her home–by reuploading it in an attempt to keep up the appearance that Daily Grace is still alive.

But Grace still has herself, and maybe that’s all that matters.

“DailyGrace is Grace Helbig, which is me. DailyGrace [the channel] was a concept owned by My Damn Channel, but Grace Helbig is my personality, owned by myself…so that’s what I’m moving forward with and that’s what, to me, is priceless.” –Grace Helbig

Image credit: Grace Helbig

I Know What You Downloaded Last Summer

YouHaveDownloaded.com
I'm a good boy.


An interesting site popped up near the end of last year called YouHaveDownloaded.com. You might not have visited it, or even heard of it, but if you’ve been using torrents, it might have heard of you.

The site is quite simple: it tracks torrents and the people (IP addresses) downloading them, much like copyright holders do (or hire companies to do for them). They claim to be tracking roughly 4%-6% of all torrent downloads and 20% of torrents from public trackers, like The Pirate Bay.

The difference from the copyright holders is that this site makes the information it collects public. You can see what it thinks the IP address you’re using has been used to torrent, or any other IP address you can think of. It might not be right, or it might be spot on.

This site just highlights what is going on all the time. Torrenting is a very public activity unless you’re making an effort to protect your privacy (like using a proxy or VPN from a reputable provider). Privacy is not the default on the interwebs.
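To make the “very public” part concrete, here’s a minimal Python sketch of the raw material involved. A BitTorrent tracker answers an announce with a “compact” peer list (BEP 23): six bytes per peer, a four-byte IPv4 address followed by a two-byte port, handed to anyone who asks. The sample bytes below are made up; collecting this at scale across many torrents is, presumably, roughly all a site like YouHaveDownloaded.com is doing.

    import socket
    import struct

    def parse_compact_peers(blob):
        """Decode a BitTorrent 'compact' peer list: 6 bytes per peer,
        a 4-byte IPv4 address followed by a 2-byte big-endian port."""
        peers = []
        for i in range(0, len(blob) - len(blob) % 6, 6):
            ip = socket.inet_ntoa(blob[i:i + 4])
            (port,) = struct.unpack(">H", blob[i + 4:i + 6])
            peers.append((ip, port))
        return peers

    # Two made-up peers as they would appear in a tracker response
    sample = (socket.inet_aton("203.0.113.5") + struct.pack(">H", 51413)
              + socket.inet_aton("198.51.100.7") + struct.pack(">H", 6881))
    print(parse_compact_peers(sample))
    # [('203.0.113.5', 51413), ('198.51.100.7', 6881)]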

IP addresses are more like PO Boxes than physical addresses: most people have dynamic IP addresses that regularly change, and once you add in the fact that some people have insecure Wi-Fi, the results on the site aren’t that accurate.

The site brings up an interesting statistic, especially if it’s true: “About 10% of all online shoppers, in the US, are torrent users as well.” In the future, will advertisers link an IP’s torrenting history to an advertising profile? Is this already happening?

The removal form

The site provides a form that supposedly enables people to request removal from the site. Don’t use it.

Previously it asked people to sign in using their Facebook accounts, and the CAPTCHA to get to the non-Facebook removal form didn’t work (i.e. they wanted to link your data with a real name, cue warning bells). Now it seems like Facebook has revoked their access to Facebook logins (they say Facebook logins are “Temporarily disabled due problems with Facebook”), so the site brings up the removal form, which asks for a name and an email address.

I’m not saying this is what the people behind the site are doing, but this would be all the information they would need, in addition to the information they have on torrents associated with your IP address, to send an extortionate email your way. Or sell your data (probably not to copyright holders, because they hire people to do this for them already).

Here’s what their removal terms are (and yeah, the rest of the site is worded like this too):

Removal Terms
The Details: By submitting a request to have your download activity removed from our database, you are acknowledging that the activity was, in fact, carried out by yourself. This means that you are only submitting a request to have the details of your own personal activity deleted. Any unrecognized activity, such as files you did not download or do not remember downloading, are not — I repeat, are not to be included in your removal request. Why is this imperative? Well, we actually don’t have to explain ourselves…sorry.

The important part is that you understand these terms and conditions before hitting that beautiful button that will erase your criminal back ground, at least for now. Wait, you did remember to read these terms before making the decision to submit a removal request, right? Of course you did, everyone reads the fine print.

Other Important Things to Consider: We make no guarantees that your information will not appear on any other databases. We may have erased your bad behavior but, keep in mind that your data on this site is aggregated public domain. So, if by chance, another sadistic group of people decides to open a similar web site, we have no control over what they do with your information. Furthermore, if you continue to involve yourself in activity like this, your future download history will, without a doubt, appear in our database again and we may not be as nice about it next time.

If any part of these terms is still unclear, please visit your local elementary school and ask to repeat grades 3 through 5.

Giving the people or company behind the site any more information about yourself is not a good idea, even if they claim that the site is a joke and you shouldn’t take it seriously.

And anyway, if your IP address is listed on the site, it must be because of the person that used it previously. Right?

The only redeeming feature of the site? You can look up the content companies that take people to court for illegal file sharing.

Three Strikes Law Shifted File Sharing From Torrents To Tunnels

Cables

Shifting file sharing

A survey commissioned by the MPAA and friends last year found that seven out of 10 people surveyed would stop illegally sharing files after receiving one notice from a copyright holder under the three strikes scheme.

Perhaps they should have also asked how many people would just change how they download files illegally?

The WAND Network Research Group at The University of Waikato has been measuring how traffic flows through a New Zealand ISP. They can split traffic into types with a pretty high degree of accuracy without having to “look inside” too much. Donald Clark compares it to looking at the postmark of a package and giving it a squeeze and being able to tell, in general terms, what’s inside, without having to open it.
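As a toy illustration of the “postmark and a squeeze” idea, here’s a Python sketch that labels a flow using nothing more than the server port and the first few payload bytes. WAND’s real classification is far more thorough; the specific rules below are my own illustrative assumptions, not their signatures.

    def classify_flow(server_port, first_payload_bytes):
        """Guess a traffic category from flow metadata only (a rough sketch)."""
        if first_payload_bytes.startswith(b"\x13BitTorrent protocol"):
            return "P2P (BitTorrent handshake)"
        if first_payload_bytes[:4] in (b"GET ", b"POST", b"HEAD"):
            return "HTTP"
        if server_port == 443 or first_payload_bytes[:1] == b"\x16":
            return "Encrypted (TLS handshake record)"
        if server_port == 1194:
            return "Tunneling (OpenVPN's default port)"
        if server_port == 119:
            return "Newsgroups (NNTP)"
        return "Unknown"

    print(classify_flow(6881, b"\x13BitTorrent protocol"))   # P2P (BitTorrent handshake)
    print(classify_flow(443, b"\x16\x03\x01\x00\xc8"))       # Encrypted (TLS handshake record)
    print(classify_flow(1194, b""))                          # Tunneling (OpenVPN's default port)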

Here’s a graph (ht Tech Liberty/1through8) showing the change in traffic volume in September 2011 and January 2012 by type, relative to January 2011. In January 2011 the Copyright (Infringing File Sharing) Amendment Act (the three strikes Skynet law) wasn’t in force. On September 1 2011 copyright holders could start sending notices to IPAPs, and around that time there was strong media interest in the law.

The resulting data is a valuable insight into how residential DSL customers at this particular ISP reacted to the new law.

WAND Three Strikes ISP data

More graphical goodness can be found in the slides from a NZNOG presentation here.

There was about a 75% decrease in BitTorrent traffic straight after the law was introduced, largely sustained into 2012, with huge increases in remote and tunneling traffic. The law isn’t stopping file sharing, just moving it underground, via VPNs, seedboxes and sites like the now-closed Megaupload.

There was also a big decrease in newsgroup traffic, even though it doesn’t appear to be targeted by the new law.

Here’s what the project leader, Shane Alcock, said:

“P2P, P2P structure, Unknown, Newsgroups and Encrypted [not all shown in the graph above] have all decreased massively from their January 2011 levels. Interestingly, each of these categories can be tied to the illegal downloading activities targeted by the CAA [Copyright Amendment Act]. P2P and P2P structure are obviously related, Newsgroups are a common source of torrent files and the Unknown and Encrypted categories were strongly suspected of containing a significant quantity of encrypted P2P traffic.

Even more interestingly, Remote, Tunneling and Files experienced similarly large growths in the amount of traffic downloaded by DSL users. This is probably indicative of people changing their approach to downloading copyrighted material. Instead of participating in file sharing on their home machines, it has become more common for people to use machines based in other countries and ship the file back home via another protocol. This might be via SSH, VPN or FTP, for example, which are all covered by the growing categories.

Similar trends are observed when looking at traffic transmitted by the DSL users. Categories associated with P2P file sharing have seen much less traffic compared with January 2011, whereas Tunneling, Remote and Files have soared.

It should be noted that although Tunneling has grown significantly, the overall amount of Tunneling traffic is still much less than the total amount of P2P traffic. But the sudden changes in application protocol usage are still very noteworthy and suggest that the CAA has had a major impact on people’s Internet usage.”

Image credit: technoloic

“Where would your government be without child porn?”

If it didn’t exist, the government would surely invent it.

Because it’s a great excuse for an internet censorship machine.

This isn’t a debate about whether child sex abuse is right or wrong. You know it’s wrong, I know it’s wrong, we all know it’s wrong. This is a debate about censorship.

Censorship causes blindness

New Zealand has an internet blacklist. A list of content that, if your internet service provider has decided to be part of the filtering project, you can’t access. Images of child sexual abuse are meant to be the only stuff blocked, but the list is secret, censorship decisions happen in private and if international experience is anything to go by, other content has a habit of turning up blacklisted.

What the filter is

Its full name is the Digital Child Exploitation Filtering System. It’s run by the Department of Internal Affairs. It’s powered by NetClean’s WhiteBox, which was supplied by Watchdog International, which “provides filtered Internet access for families, schools and businesses”.

The DIA say that they’re contractually constrained to only use the filter to block child sexual abuse material.

They say that:

“The filtering system is also a tool to raise the public’s awareness of this type of offending and the harm caused to victims. The Group agreed that this particular aspect of the filter needs to be more clearly conveyed to the public.”

So basically, it’s to make it seem like they’re doing something, because it doesn’t actually prevent people from accessing child sex abuse images.

The list is maintained by three people (pdf) (mirror), and sometimes there is a backlog of sites to investigate: “The Group was advised that the filter list comprises approximately 500 websites, with several thousand more yet to be examined.”

How it works

A list of objectionable sites is maintained by the Department. If someone using an ISP that’s participating in the filter tries to access an IP address on the filter list, they’ll be directed to the Department’s system. The full URL will then be checked against the filtering list. If the URL has been filtered, users end up at this page. The user can appeal for the site to be unfiltered, but no appeals have been successful yet (and some of the things people have typed into the appeal form are actually quite disturbing).
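Here’s a minimal sketch of that two-step design in Python. The names and data structures are my assumptions for illustration, not the DIA’s actual implementation; the point is that only traffic headed to an IP address hosting something on the list gets diverted, and the full URL check happens inside the Department’s system.

    BLOCKED_URLS = {"http://192.0.2.10/abuse/page.html"}   # the secret filter list (hypothetical entry)
    BLOCKED_IPS = {"192.0.2.10"}                           # IPs hosting at least one listed URL
    BLOCK_PAGE = "http://dce.net.nz/"

    def handle_request(dest_ip, url):
        # Step 1: the ISP only diverts traffic headed to an IP on the coarse list.
        if dest_ip not in BLOCKED_IPS:
            return "passes normally (never touches the filter)"
        # Step 2: the diverted request's full URL is checked against the filter list.
        if url in BLOCKED_URLS:
            return "redirected to " + BLOCK_PAGE
        return "passes (proxied through the Department's system)"

    print(handle_request("203.0.113.9", "http://example.org/"))               # never diverted
    print(handle_request("192.0.2.10", "http://192.0.2.10/innocent.html"))    # diverted, then passed
    print(handle_request("192.0.2.10", "http://192.0.2.10/abuse/page.html"))  # blocked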

Is my internet being filtered?

The internet of 2.2 million ISP clients is being filtered.

It’s voluntary for ISPs to participate in because it wasn’t introduced through legislation; however, the big ISPs are participating:

  • Telecom
  • TelstraClear
  • Vodafone
  • 2degrees

Others are:

  • Airnet
  • Maxnet
  • Watchdog
  • Xtreme Networks

I assume, for the ISPs providing a mobile data service, the filter is being applied there too.

Why the filter is stupid

Child pornography is not something someone stumbles upon on the internet. Ask anyone who has used the internet whether they have innocently stumbled upon it. They won’t have.

It’s easy to get around. The filter doesn’t target protocols other than HTTP. Email, P2P, newsgroups, FTP, IRC, instant messaging and basic HTTPS encryption all go straight past the filter, regardless of content. Here’s NetClean’s brochure on WhiteBox (pdf), and another (pdf). Slightly more technical, but still basic, tools like Tor also punch holes in the filter. The filter is not stopping anyone who actually wants to view this kind of material.

A much more effective use of time and money is to try to get the sites removed from the internet, or you know, track down the people sharing the material. Attempts to remove child sex abuse material from web hosts will be supported by a large majority of hosts and overseas law enforcement offices.

It is clear that the DIA don’t do this regularly. They’re more concerned with creating a list of URLs.

From the Independent Reference Group’s December 2011 report:

“Additionally 18% of the users originated from search engines such as google images.”

Google would take down child sex abuse images from search results extremely fast if they were made aware of them. And it is actually extremely irresponsible for the DIA not to report those images to Google.

Update: The DIA say they used Google Images as an example, and that they do let Google know about content they are linking to.

“The CleanFeed [the DIA uses NetClean, not Cleanfeed] design is intended to be extremely precise in what it blocks, but to keep costs under control this has been achieved by treating some traffic specially. This special treatment can be detected by end users and this means that the system can be used as an oracle to efficiently locate illegal websites. This runs counter to its high level policy objectives.” Richard Clayton, Failures in a Hybrid Content Blocking System (pdf).

It might be possible to use the filter to determine a list of blocked sites, thus making the filter a directory or oracle for child sex content (however, it’s unlikely people interested in this sort of content actually need a list). Theoretically one could scan IP addresses of a web hosting service with a reputation for hosting illegal material (the IWF have said that 25% of all websites on their list are located in Russia, so a Russian web host could be a good try). Responses from that scan could give up IP addresses being intercepted by the filter. Using a reverse lookup directory, domain names could be discovered that are being directed through the filter. However, a domain doesn’t have to contain only offending content to be sent through the DIA’s system. Work may be needed to drill down to the actual offending content on the site. But this would substantially reduce the effort of locating offending content.
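A rough sketch of that scan, assuming you had some way to tell a diverted response from a direct one. That detection step is the crux, and the heuristic below (looking for a block-page or proxy artefact in the response) is a placeholder assumption on my part; Clayton’s paper relies on lower-level clues such as TTL changes. The IP range is hypothetical.

    import socket
    import requests

    def looks_diverted(ip):
        """Placeholder heuristic: does a response show signs of the filtering proxy?"""
        try:
            resp = requests.get("http://%s/" % ip, timeout=5)
        except requests.RequestException:
            return False
        return "dce.net.nz" in resp.text or "via" in {k.lower() for k in resp.headers}

    def scan(ip_range):
        for ip in ip_range:
            if looks_diverted(ip):
                try:
                    host = socket.gethostbyaddr(ip)[0]   # reverse lookup to find the domain
                except socket.herror:
                    host = "(no reverse DNS)"
                print(ip, "appears to pass through the filter:", host)

    # A hypothetical hosting range with a reputation for this material
    scan("198.51.100.%d" % i for i in range(1, 255))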

Child sex abuse sites could identify DIA access to sites and provide innocuous images to the DIA and child sex abuse images to everyone else. It is possible that this approach is already happening overseas. The Internet Watch Foundation, who run the UK’s list, say in their 2010 annual report that “88.7% of all reports allegedly concerned child sexual abuse content and 34.4% were confirmed as such by our analysts”.

Someone could just use an ISP not participating in the filter. However, people searching for this content likely know they can be traced and will likely be using proxies etc. anyway. Using proxies means they could access filtered sites through an ISP participating in the filter as well.

It is hard (practically, and mentally) for three people to keep on top of child sex abuse sites that, one would assume, change locations at a frequent pace, while, apparently, reviewing every site on the list monthly.

The filter system becomes a single point of attack for people with bad intentions.

The DIA, in their January 2010 Code of Practice (pdf), even admit that:

  • “The system also will not remove illegal content from its location on the Internet, nor prosecute the creators or intentional consumers of this material.” and that
  • “The risk of inadvertent exposure to child sexual abuse images is low.”

Anonymity

The Code of Practice says:

“6.1          During the course of the filtering process the filtering system will log data related to the website requested, the identity of the ISP that the request was directed from, and the requester’s IP address.
6.2          The system will anonymise the IP address of each person requesting a website on the filtering list and no information enabling the identification of an individual will be stored.”

“6.5          Data shall not be used in support of any investigation or enforcement activity undertaken by the Department.” and that

“5.4          The process for the submission of an appeal shall:
•    be expressed and presented in clear and conspicuous manner;
•    ensure the privacy of the requester is maintained by allowing an appeal to be lodged anonymously.”

Anonymity seems to be a pretty key message throughout the Code of Practice.

However…

In response to an Official Information Act request, the DIA said:

“When a request to access a website on the filtering list is blocked the system retains the IP address of the computer from which the request originated. This information is retained for up to 30 days for system maintenance releases and then deleted.” [emphasis mine]

Update: The DIA says that the IP address is changed to 0.0.0.0 by the system.

The site that people are directed to when they try to access a URL on the blacklist (http://dce.net.nz) is using Google Analytics. The DIA talk the talk about the privacy and anonymity around the filter, but they don’t walk the walk, sending information about New Zealand internet users to Google in the United States. It’s possible this is how the DIA gets the data on device type etc. that they use in their reports. Because anyone can simply visit the site (like me, just now), those statistics wouldn’t be accurate.

DCE filter Google Analytics

From the Independent Reference Group’s August 2011 (pdf) minutes:

“Andrew Bowater asked whether the Censorship Compliance Unit can identify whether a person who is being prosecuted has been blocked by the filtering system. Using the hash value of the filtering system’s blocking page, Inspectors of Publications now check seized computers to see if it has been blocked by the filtering system. The Department has yet to come across an offender that has been blocked by the filter.”

I’m not exactly sure what they mean by hash value, but this would seem to violate the “no information enabling the identification of an individual will be stored” principle.

Update: They are searching for the fingerprint of content displayed by the blocking page. It doesn’t seem like they could match up specific URL requests, just that the computer had visited the blocking page.
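If that reading is right, the check is probably something like the sketch below: fingerprint the content the blocking page serves, then look for that fingerprint among files recovered from a seized machine (for example a browser cache). The hash choice, paths and placeholder page content are all assumptions for illustration.

    import hashlib
    from pathlib import Path

    def fingerprint(data):
        return hashlib.sha256(data).hexdigest()

    # Placeholder stand-in for whatever the dce.net.nz blocking page actually serves
    BLOCK_PAGE_HASH = fingerprint(b"<html>...blocking page content...</html>")

    def scan_cache(cache_dir):
        """Flag any cached file whose content matches the blocking page."""
        for path in Path(cache_dir).rglob("*"):
            if path.is_file() and fingerprint(path.read_bytes()) == BLOCK_PAGE_HASH:
                print("Blocking page fingerprint found in", path)

    scan_cache("/evidence/browser-cache")   # hypothetical path on a seized machine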

And, from the Independent Reference Group’s April 2011 (pdf) minutes:

“For all 4 of the appeals the complainant did not record the URL. This required a search of the logs be carried out to ensure that the site was correctly being blocked.”

Appeals are clearly not anonymous if they can be matched up with sites appellants have attempted to access.

Update: The reviewers look at the URLs blocked shortly before and after the appeal request to work out the URL if it isn’t provided.

9000 URLs!

The DIA earlier reported that there were 7000+ URLs on their blacklist. This dropped to 507 in April 2011, 682 in August 2011, and 415 in December 2011. Those numbers are much closer to the 500 or so URLs on IWF’s blacklist.

Where did these 6500 URLs disappear to (or, more accurately, why did they disappear)? Was something being erroneously blocked during the trial period, or was 7000 just a nice number to throw around to exaggerate the likelihood of coming across child sex abuse images (though, even with 7k sites, the likelihood still would have been tiny)?

Scope creep

Firstly, we weren’t going to have a filter at all:

‘“We have been following the internet filtering debate in Australia but have no plans to introduce something similar here,” says Communications and IT minister Steven Joyce.

“The technology for internet filtering causes delays for all internet users. And unfortunately those who are determined to get around any filter will find a way to do so. Our view is that educating kids and parents about being safe on the internet is the best way of tackling the problem.”’

Then it was said that:

“The filter will focus solely on websites offering clearly illegal, objectionable images of child sexual abuse.”

and

“Keith Manch said the filtering list will not cover e-mail, file sharing or borderline material.” [emphasis mine]

One would assume from “images of child sexual abuse” that they would be, you know, images of children being sexually abused. However, it seems that CGI and drawings (Hentai) have made the list.

From the minutes of the Independent Reference Group’s October 2010 meeting:

“Aware that the inclusion of drawings or computer generated images of child sexual abuse may be considered controversial, officials advised that there are 30 such websites on the filtering list [that number is now higher, 82 as of December 2011]. Nic McCully advised that officials had submitted computer generated images for classification and she considered that only objectionable images were being filtered.”

The arguments around re-victimization kind of fall apart when you’re talking about a drawing.

And from the borderline material file:

“The Group was asked to look at a child model website in Russia. The young girl featured on the site appears in a series of 43 photo galleries that can be viewed for free. Apparently the series started when the girl was approximately 9 years old, with the latest photographs showing her at about 12 years old. The members’ part of the site contains more explicit photos and the ability to make specific requests. While the front page of the website is not objectionable, the Group agreed that the whole purpose of the site is to exploit a child and the site can be added to the filter list.”

Clearly illegal, objectionable images of child sexual abuse? No, but we think it should be filtered so we went and did that.

Dodgy DIA

The DIA was secretive about the filter being introduced in the first place. Their first press release about it was two years after a trial of the system started. I wonder how many of those customers using an ISP participating in the trial knew their internet was being filtered during that time?

The Independent Reference Group is more interesting than independent. Steve O’Brien is a member of the group. He’s the manager of the Censorship Compliance Unit. To illustrate this huge conflict of interest, he is the one who replies to Official Information Act requests about the filter. Because the Censorship Compliance Unit operate it.

“The Group was advised that the issue of Steve O’Brien’s membership had been raised in correspondence with the Minister and the Department. Steve O’Brien offered to step down if that was the wish of the Group and offered to leave the room to allow a discussion of the matter. The Group agreed that Steve O’Brien’s continued membership makes sense.” [emphasis mine]

That was the only explanation given. That it makes sense that he is a member. Of the group that is meant to be independent.

Additionally, the DIA seems to have accidentally deleted some reports that they should have been keeping.

From Tech Liberty:

“Last year we used the Official Information Act to ask for copies of the reports that the inspectors [have] used to justify banning the websites on the list. The DIA refused. After we appealed this refusal to the Ombudsman, the DIA then said that those records had been deleted and therefore it was impossible for them to give them to us anyway. The Department has an obligation under the Public Records Act to keep such information.

We complained to the Chief Archivist, who investigated and confirmed that the DIA had deleted public records without permission. He told us that the DIA has promised to do better in the future, but naturally this didn’t help us access the missing records.”

List review

The Code of Practice says:

“4.3    The list will be reviewed monthly, to ensure that it is up to date and that the possibility of false positives is removed. Inspectors of Publications will examine each site to ensure that it continues to meet the criteria for inclusion on the filtering list.”

It’s unlikely this actually happens.

Here are some statistics on how many URLs have been removed:

  • December 2011: 267 removed
  • August 2011: 0 removed
  • April 2011: 108 removed

It’s impossible that between April and August there were no URLs to remove.

In the Independent Reference Group’s December 2011 report it seemed like the following was included because it happens so rarely:

“The list has been completely reviewed and sites that are no longer accessible or applicable (due to the removal of Child Exploitation Material) have been removed.”

The Independent Reference Group has the power to review sites themselves. But in at least one case, they chose not to:

“Members of the Group were invited to identify any website that they wish to review. They declined to do so at this stage.”

 

The filter isn’t covered by existing law and didn’t pass through Parliament. Appropriate checks and balances have not taken place. The DIA did this on their own.

By law, the Classification Office has to publish its decisions, which they do. The DIA’s filter isn’t covered under any law, and they refuse to release their list. The DIA say that people could use the list to commit crimes, but the people looking for this material will have already found it.

What if the purpose of the filter changes? The DIA introduced it without a law change, the DIA can change it without a law change. What if they say “if ISPs don’t like it, they can opt out of the filter”? How many ISPs will quit?

The only positive is that the filter is opt in for ISPs. Please support the ISPs that aren’t using the filter. Support them when they’re accused of condoning child pornography, and support them when someone in government decides that the filter should be compulsory for all ISPs.

 

Side note: why does all of the software on the DIA’s family protection list, bar one, cost money? There is some excellent, or arguably better, free software available. There’s even a free version of SiteAdvisor, but the DIA link to the paid one. Keep in mind that spying on your kids is creepy. Talk to them, don’t spy. The video for Norton Online Family hilariously and ironically goes from saying “This collaborative approach makes more sense than simply spying on your child’s internet habits [sitting down and talking — which is absolutely correct]” to talking about tracking web sites visited, search history, social networking profiles, chat conversations and then how they can email you all about them. Seriously. Stay away.

Image credit: Andréia Bohner

Congratulations Internets

But your work is not over.

Wikipedia SOPA PIPA Blackout Protest

On January 18 the users and companies of the internet rallied together to protest against SOPA and PIPA, bills that would censor the internet. Check out the numbers. It worked. Here’s part of a huge list, with even bigger names on it, of the sites that participated in the blackout. Google, Wikipedia, Reddit, BoingBoing and Wired are among them. Here’s the page Wikipedia displayed. The Wikipedia page about SOPA and PIPA was accessed more than 162 million times during the 24 hours the site was blacked out. More than eight million people looked up their elected representatives’ contact information via Wikipedia’s tool, crashing the Senate’s website. At one point, 1% of all tweets on Twitter included the #wikipediablackout hashtag.

SOPA? PIPA?

Is it over?

It is likely the bills will be back in one form or another:

What’s the best way for me to help? (for U.S. citizens)

The most effective action you can take is to call your representatives [phone calls have the most impact] in both houses of Congress, and tell them you oppose SOPA, PIPA, and the thinking behind them.

What’s the best way for me to help? (for non-U.S. citizens)

Contact your country’s Ministry of Foreign Affairs or similar government agency. Tell them you oppose SOPA and PIPA, and any similar legislation. SOPA and PIPA will affect websites outside of the United States, and even sites inside the United States (like Wikipedia) that also affect non-American readers — like you. Calling your own government will also let them know you don’t want them to create their own bad anti-Internet legislation.

For New Zealanders, that’s the Ministry of Foreign Affairs and Trade. Their contact details are here.

Megaupload

Megaupload’s website was taken down a day after the protest (without trial), with related people being arrested in New Zealand, and property confiscated. Are we okay with helping enforce US copyright law which, as SOPA and PIPA show, is heavily influenced by the entertainment industry? Is this what extradition should be used for?

It appears, at first glance, that Megaupload was removing infringing material on request, although it seems their takedown procedure was molded around the way they store files: only one copy of a file is stored even if it is uploaded more than once, but a unique URL is given out for each upload.
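A minimal sketch of that storage model, based only on the public reporting above (the names and details are mine, not Megaupload’s code): identical uploads share one stored blob, every upload hands back a fresh URL, and a takedown removes one URL without touching the blob or anyone else’s links.

    import hashlib
    import secrets

    blobs = {}   # content hash -> file bytes (stored once)
    links = {}   # public URL token -> content hash

    def upload(data):
        digest = hashlib.sha256(data).hexdigest()
        blobs.setdefault(digest, data)            # deduplicate: keep a single copy
        token = secrets.token_urlsafe(8)          # but issue a unique URL per upload
        links[token] = digest
        return "https://example-locker.test/?d=" + token

    def takedown(url):
        token = url.split("=", 1)[1]
        links.pop(token, None)                    # removes this link only

    first = upload(b"the same movie file")
    second = upload(b"the same movie file")
    takedown(first)                               # a notice against one URL...
    print(len(blobs), "blob,", len(links), "link still live")   # ...leaves the blob and the other link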

Megaupload has many similarities to other websites, which makes this concerning. It was definitely used for legitimate and legal purposes by legitimate users.

Tech Liberty asks: do we need to obey laws from other countries while on the internet, and if so, which countries?

Even if I have a web host in one country, what if they provide services via another country? The internet is so connected, how do we know whose laws apply?

Image credit: LoveNMoreLove/Wikipedia

Megafail: Universal Music Gone Rogue

Megaupload uploaded a $3 million+ viral video attempt in the form of a song, The Mega Song, to YouTube. Because it contains endorsements from many musicians who have contracts with Universal Music Group, Universal weren’t the happiest of campers.

Macy Gray sings in the video, which features will.i.am, P. Diddy, Kanye West, Kim Kardashian (who comes running whenever someone utters the word “endorsement”), Lil Jon, The Game, Floyd Mayweather, Chris Brown, Jamie Foxx, Serena Williams and Ciara on camera. (Side note: It’s accepted that Chris Brown can do endorsements now?)

Using YouTube’s content management system, which Universal has access to as copyright holders, they took the video down. They didn’t own any content in it. They just didn’t like it.

The lawsuit

Now Megaupload aren’t the happiest of campers, and are suing Universal, trying to prevent Universal from interfering with the video, which is now back up after YouTube apparently asked Universal why exactly they took it down.

The New Zealand connection (read: Universal don’t know what their own artists sound like)

Apart from Kim Schmitz/Kim Dotcom, Megaupload’s Chief Innovation Officer, having a house here in New Zealand, where he also has permanent residency (which he celebrated by giving Auckland a US$500,000 New Year fireworks display), Universal claimed that they took down the video because it contained content from one of their artists, Gin Wigmore.

Wigmore, of course, doesn’t appear in the video at all, in audio or visual form (but was approached to sing in it), so perhaps Universal have forgotten what their artists actually sound like, and mistook Macy Gray for her.

will.i.am

Two takedown notices were received, the second one from will.i.am (well, his lawyer), who appears in the video, saying “When I’ve got to send files across the globe, I use Megaupload”.

Ira Rothken, lawyer for Megaupload, says that written permission in the form of signed Appearance Consent and Release Agreements was provided by everyone in the video, including will.i.am. will.i.am’s signed form, which you can read here (pdf; will.i.am’s real name is William Adams), is pretty convincing.

The Hollywood Reporter has Ken Hertz, will.i.am’s lawyer, saying that he “never consented to the ‘Megaupload Mega Song’”. Because he delivered that line to camera for another reason?

Dotcom says that will.i.am assured him that he “had not authorized the submission of any takedown notice on his behalf”.

Universal’s takedown rights “not limited to copyright infringement”

Universal claim that they can take down the video under an agreement with YouTube–not the Digital Millennium Copyright Act. In a letter (pdf) to YouTube, Kelly Klaus, a Universal lawyer, says that “As you know, UMG’s [takedown] rights in this regard are not limited to copyright infringement, as set forth more completely in the March 31, 2009 Video License Agreement for UGC Video Service Providers, including without limitation in Paragraphs 1(b) and 1(g) thereof.”

In that case the DMCA’s rules and protections around takedown notices wouldn’t apply. If this is true, YouTube isn’t exactly open about it. They claimed that the video had been taken down by a copyright claim in the message displayed when people tried to watch it:

Mega Song block notice on YouTube

Rothken says “What they are basically arguing, they can go ahead and suppress any speech they want without any consequences. That’s not a workable paradigm”.

 

This is, perhaps, a huge tick in the column against the Stop Online Piracy Act, which is currently being debated.

Streisand effect, here we come.

Image credit: TorrentFreak