Facebook took action on over 1.62 crore pieces of content
Under IT rules that came into effect earlier this year, large digital platforms with more than 50 lakh (5 million) users are required to publish compliance reports every month. These reports provide information on the action taken on complaints received by the platforms, along with details of content removed or disabled using automated tools.
In October, Facebook took "action" on more than 18.8 million pieces of content across 13 categories. During the same period, Instagram took action on more than 3 million pieces of content across 12 categories.
In the latest report, Meta said that between November 1 and November 30, Facebook received 519 user reports through its Indian grievance mechanism. Of these, 461 cases were resolved by providing users with tools to address their issues.
According to the report, of the more than 16.2 million pieces of content Facebook actioned in November, 10 million related to spam, 20 lakh (2 million) to violent and graphic content, and 1.5 million to adult nudity and sexual activity. Action was also taken on more than one lakh pieces of hate speech, as well as on several other categories, including 102,700 pieces of bullying and harassment content and 370,500 pieces of suicide and self-injury content. Action was also taken on content endangering children.
On Instagram, of the more than 32 lakh pieces of content actioned across 12 categories, the largest share related to suicide and self-injury, with action taken on 815,800 pieces. Another 333,400 pieces of violent and graphic content were removed.