In the summer of 2009, Carla Franklin landed on a Facebook page created under her name.
She was shocked to see her own image staring back: a swimsuit photo from her earlier days as a model.
Worse yet, Franklin also found a YouTube channel dedicated to her, featuring a student film she had acted in years before. The page labeled her a prostitute and listed her contact information.
Franklin immediately suspected that the cyberbully posting the material was a man she had gone on a few dates with three years earlier. But it took months to get the companies to remove the content, and a court order to force Google to provide the information necessary to link him to the posts. Three years later, the case still isn't resolved.
Franklin's experience highlights the long legal battle in store for most people seeking to unmask online harassers and remove intimate, hateful or defamatory material from the Internet. It also underscores the tension between the rights of harassment victims and the rights of websites, which were granted broad immunity from the actions of their users through legislation that protects free expression online.
These challenges and frictions will be the subject of a panel discussion in San Francisco, during a fundraiser for Without My Consent, a nonprofit that helps to educate harassment victims and their attorneys on legal options.
Franklin, who now acts as an advocate on these issues, will appear on the panel, along with experts from Twitter, Reputation.com, the California attorney general's office and elsewhere.
The exact scope of online harassment is unknown, but it's clearly widespread.
University of Maryland law Professor Danielle Citron noted in a recent blog post that the Bureau of Justice Statistics estimated that "850,000 people in 2006 experienced stalking with a significant online component," while other researchers predict that 30 percent of Internet users will "face some form of cyber harassment in their lives."
It can take many forms, including hate speech, threats of rape and sexual violence, and posting of nude or doctored images. The threat of uploading intimate pictures and videos is sometimes used to blackmail victims for sex or money.
The results can be traumatic and tragic, as exemplified by the case of Tyler Clementi. In 2010, the 18-year-old Rutgers student killed himself after his roommate hid a webcam in their room and streamed video online of his sexual encounter with another man.

Between the Communications Decency Act and the Digital Millennium Copyright Act, online companies enjoy broad protection from legal liability for the content created or posted by users, be it copyrighted or defamatory.
These laws basically ensure that companies like Craigslist, Twitter and Facebook can host open forums where people can freely trade ideas and goods. But they also mean that victims of legitimate harassment face a gauntlet of challenges in getting material removed and identifying those responsible.
It costs at least $10,000 in legal fees to issue a subpoena to an online company demanding the IP address that links a real person to an uploaded file, said Colette Vogele, co-founder of Without My Consent. It costs thousands more to file a lawsuit against the perpetrator.
Depending on the facts of the case and the state involved, the victim might be able to sue the person for defamation, publication of private facts, breach of confidence and other claims. Some acts can rise to the level of criminal offenses, including stalking and extortion.
There are risks in filing a lawsuit, however, as Franklin learned all too well. After she sued Google, she found her name and the details of her case splashed in the New York Post and New York Daily News, which dubbed the Duke and Columbia graduate a "brainy ex-model." Bloggers and commenters were far nastier.
Victim advocates like Vogele argue it should be much easier for victims to pursue remedies. Among other things, they believe more of the offenses should be considered crimes, it should be simpler to sue under a pseudonym, and courts should demand greater cooperation from websites.
"Courts can apply (the Communications Decency Act) in a less overzealous way," Vogele said. "We need to apply it in a way that doesn't protect speech that is harmful."
On that question, however, many free speech advocates disagree. It should be difficult to unmask anonymous commenters and remove online material because those hurdles protect free speech, said Matt Zimmerman, senior staff attorney with the Electronic Frontier Foundation.
If sites were required to respond to harassment claims that hadn't been evaluated by a judge, many would simply remove material by rote. They wouldn't spend the time and money to determine whether someone was legitimately harassed or just meanly criticized, nor would they be particularly qualified to make that call.
Current law "protects channels of speech but still allows people to pursue claims against the bad actors," Zimmerman said. "Yes, it costs you something to pursue your claims, but that's the social deal we've made."
As it is, some courts have applied too lenient a standard in certain cases, he and others say. Notably, in 2009, a New York judge required Google to reveal the name of an anonymous blogger who had called model Liskula Cohen a "skank" and "ho."
Here the critical question becomes: Is labeling someone a "ho" equivalent to describing someone as a prostitute - a factual claim that could be libelous if false - or is it, in common use, a generic criticism?
Or to paraphrase Zimmerman: Would a reasonable reader of the blog go away thinking, "Wow, I didn't realize Cohen had sex for money," or would they assume it was a crudely stated personal opinion?
However distasteful the blogger's actions, the courts shouldn't serve as a "tax-subsidized private investigator" for celebrities, politicians or anyone else who wants to out and silence their critics, he said.
'I try to forget'
On the other hand, it's difficult to feel the law has struck entirely the right balance when you hear a story like J's. The California woman (who didn't want her name used) made one mistake a decade ago that continues to haunt her.
She agreed to make a sex tape with her boyfriend at the time, with the understanding that it would be kept private. Instead, years after they broke up, it appeared online, edited to look like the debut film of a porn actress. Her real name was used.
With Vogele's help, J sued the man, winning monetary damages along with a requirement that he assist in the ongoing removal of the video and pictures online.
But the legal effort took years and the images had spread so widely by then that they continue to resurface. Her husband regularly searches for new appearances and sends take-down requests to sites. J, who has been contacted by strangers who have viewed the video, continues to screen calls and avoid social media.
"I try to forget, I try to pretend it never happened, but I know it will always be there," she said. "It's a nightmare that will always keep following me."
For more information, or for more on legal options for victims of cyber harassment, go to withoutmyconsent.org.