Content Marketplace Pepper Content Raises INR 31.6 Cr Funding Led by Lightspeed India

The company said it plans to use the fresh capital to expand to text, design, audio, and video content in India and Southeast Asia


You’re reading Entrepreneur India, an international franchise of Entrepreneur Media.

Content marketplace Pepper Content on Tuesday announced it has raised INR 31.6 crore (USD 4.2 million) in a round led by Lightspeed India.

A clutch of angel investors, including Beerud Sheth (Founder, Upwork), Balaji Srinivasan (Coinbase CTO), Gaurav Munjal (Founder, Unacademy), Aakrit Vaish (Founder, Haptik), Miten Sampat (ex-CSO, Times Internet), Akhil Paul (Caparo Group), Utsav Somani (iSeed/AngelList), Dilip Khandelwal (Ex-MD, SAP Labs India) and Gaurav Mandlecha (Growth, Airmeet) also participated in the funding round.

The company said it plans to use the fresh capital to expand to text, design, audio, and video content in India and Southeast Asia.

Founded in 2017 by Anirudh Singla and Rishabh Shekhar, Pepper Content helps content creators connect with businesses looking for content marketing services.

The company claims close to 30,000 content writers, graphic designers, language specialists and editors have applied to be part of the platform. Only 10 per cent of that talent works on the platform, owing to the company’s high selection criteria, it added.

Content creators on the Pepper Content platform have created over 1 lakh content pieces, earning over USD 400,000 in the first two years. The company claims to generate over USD 600,000 in revenue and is targeting USD 1 million ARR in the current fiscal.

Commenting on the deal, Dev Khare, Partner at Lightspeed India, said: “We are proud to partner with Anirudh and his team with a vision to use the Pepper market platform to enable any company around the world to source content on-demand, with high quality, and at

Facebook bans Holocaust-denial content after allowing it for years

  • Facebook announced Monday it was changing its hate speech policy to “prohibit any content that denies or distorts the Holocaust.”
  • The company has faced criticism for more than a decade over its refusal to moderate anti-Semitic content that distorts or denies the Holocaust, in which Nazis and their allies systematically killed 6 million Jews.
  • In the weeks leading up to the 2020 presidential election, Facebook has attempted to mitigate criticism that it fails to prevent the spread of dangerous conspiracy theories and disinformation on its platform. Just last week, Facebook said it banned QAnon accounts across its platforms.
  • Visit Business Insider’s homepage for more stories.

Facebook has banned Holocaust-denial content from the platform after years of criticism over its refusal to take action against such anti-Semitic rhetoric.

Facebook announced Monday it was updating its hate speech policy to “prohibit any content that denies or distorts the Holocaust.”

The policy change marks an abrupt about-face on Facebook’s refusal, for more than a decade, to remove content from its platform that denies the existence of the Holocaust and the genocide of millions of Jews and other minority groups. The platform has faced pressure from human rights and civil rights groups to take a stricter stance against such content, but Facebook has maintained that the “mere statement” of Holocaust denial doesn’t violate policies.

“I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive,” CEO Mark Zuckerberg told Recode in July 2018. “But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong.”

In the meantime, it appears that Holocaust-denial content on Facebook has continued to not just exist, but flourish. A recent study, published in August by

Revived All Def Digital Charts Broader Content, Partnerships Under New Ownership

A year after All Def Digital, one of the web’s biggest Black-owned digital sites, collapsed in the aftermath of #MeToo allegations against founder Russell Simmons, the reborn company is charting a new course led by two former tech executives and backed by an ownership group that includes music and sports notables such as T.I., Killer Mike, Jason Geter, and Baron Davis.

The new ADD is moving beyond the original platform’s tight digital focus on hip hop, comedy and slam poetry. Under new CEO Cedric J. Rogers and partner Shaun Newsum, the new ADD is exploring more genres and distribution approaches. It’s also expanding relationships and programming ventures with traditional media companies, working with WarnerMedia-owned Fullscreen and Comcast-backed production company Jupiter Entertainment to beef up its programming, creator networks, brand relationships, and several new initiatives.

Rogers and Newsum are engineers by training, having worked at Apple and what’s now called Disney Streaming Services, respectively. By 2018, though, they had launched Culture Genesis to produce programming for underserved “black and brown” audiences online, Rogers said. The company worked with Kevin Hart’s LOL Network and Steve Harvey’s production company Harvey International before bringing Bar Exam, a digital game show about hip hop music, to All Def.

Then, things fell apart for ADD. Simmons was one of a number of high-profile entertainment executives facing multiple accusations of sexual misconduct, and stepped away from the company in 2017. Last year, as funding and business relationships dried up, ADD went into bankruptcy.

Ultimately, through a bankruptcy process called Assignment for Benefit of Creditors, Culture Genesis was tabbed to take over the ADD assets, in part because it already had relationships with many

Facebook Decides Holocaust Denial Content Is Bad, Actually

Facebook has, for years, intentionally looked the other way when users shared content that denied or distorted the Holocaust.

That may finally be changing.

Facebook’s vice president of content policy, Monika Bickert, said Monday the company is updating its hate speech policy to prohibit Holocaust denials and distortions. 

A “well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people,” Bickert said, prompted the long-overdue change.

The announcement makes no mention of how Facebook itself contributed to that anti-Semitic rise. The social media platform has become a clearinghouse for misinformation concerning virtually every subject, including Holocaust denials and anti-Semitism in general.

It’s unclear how Facebook intends to enforce the expanded policy, or how it will define content that violates it.

“Enforcement of these policies cannot happen overnight,” Bickert acknowledged in the announcement. “There is a range of content that can violate these policies, and it will take some time to train our reviewers and systems on enforcement.”

The company told HuffPost it will apply the policy to all of its users, including politicians. Politicians have enjoyed lax enforcement of Facebook’s community standards, thanks to a loophole the social media company created that protects their posts as “newsworthy content.”

Facebook had resisted calls to take down Holocaust denial content going back to at least 2011, when 21 Holocaust survivors pleaded with the company to deny access to users who promoted the conspiracy theory that Nazis didn’t murder 6 million Jews during WWII.

“By allowing this hate propaganda on Facebook,” the group warned the company in a letter, “you are exposing the public and, in particular, youth to the anti-Semitism which fueled the Holocaust.”

Facebook, at the time, nevertheless decided Holocaust denials didn’t violate its terms.

A Facebook spokesperson declined to address why the company

Pakistan bans TikTok for ‘immoral’ content

The Pakistan Telecommunication Authority (PTA) on Friday issued instructions to block controversial video-sharing platform TikTok.

In a statement, the PTA said the ban followed a number of complaints about the type of content shared on the app.

“In view of a number of complaints from different segments of the society against immoral/indecent content on the video-sharing application TikTok, Pakistan Telecommunication Authority has issued instructions for blocking of the application,” it wrote.

The PTA said after considering the complaints, as well as the nature of the content being “consistently” posted, it issued a final notice to the application.

The watchdog said it gave TikTok considerable time to respond and comply with its instructions for “development of effective mechanism for proactive moderation of unlawful online content”.

“However, the application failed to fully comply with the instructions, therefore, directions were issued for blocking of TikTok application in the country,” it continued.

“TikTok has been informed that the Authority is open for engagement and will review its decision subject to a satisfactory mechanism by TikTok to moderate unlawful content.”

In late August, a video of a man dying by suicide was posted on Facebook. The graphic video spread across other platforms such as Instagram, Twitter, and YouTube, but it continued to appear on TikTok weeks later as the app struggled to remove the horrific content.

The PTA had at the time asked TikTok to “block the vulgar, indecent, and immoral content for viewership in Pakistan”.

It asked the platform to put in place stronger content monitoring and moderation mechanisms so that unlawful material could not be accessed or viewed within Pakistan.

It also made similar requests of YouTube, demanding that the Google-owned site “block vulgar, indecent, immoral, nude, and hate speech content for viewing in Pakistan”.

“PTA has done so keeping in view the