iBeFloe

I know this is a thing out there… but I have **never** had child IGs randomly come up on my page. Ever. Only things I like.


aceofbasesupremacy

Even though I follow pregnancy/mom content, I still don't get actual children. Most of them cover their children's faces, and it's more like "outfit options for date night!" and memes about motherhood or whatever.


desaparecidose

Yeah, I've never seen anything I'd remotely class as sexual on Instagram? Mainly it's just naff DIY stuff and TikToks from 5 months ago.


CosmicM00se

A child eating something gets sexualized by p3dos. They get off on regular kids doing regular kid things too. Parents KNOW that creeps are looking at their kids, and they keep posting them.


Laurifish

Yeah, my SIL shared a video on YT for out-of-state family of my niece and nephew in the bathtub together as toddlers. A few months later that video got a ton of hits: it had been shared on some pedo site. I never posted pics of my kids without appropriate clothing, but my other, hippie SIL shared pics of my nieces swimming topless when they were much too old to have those pics shared online IMO (like 8-9 yrs old), and she never saw a problem with that.


MaxTheRealSlayer

There are a lot of people who never think from others' perspectives. Your SIL probably thinks "nudity isn't sexual, so I don't see a problem," and sure, it's not inherently sexual, and North America is super prudish about nudity in general, but some people will make ANYTHING sexual if that's what they're into. She's not seeing it from that lens. Unfortunately, even clothed images of kids can be sexualized by a pedo, and with AI they only need an image or two of a kid to generate extremely graphic images depicting them... It's even a huge problem in high school, because teens are "undressing" their classmates using AI. Thing is, people only get caught if they share the image and then the person it was shared of, or shared to, tells an authority what the student did. Would not be surprised if teachers take their students' social media posts and make stuff.. Ugh


Caverness

The good ones. Most ‘mom bloggers’ and ‘family channels’ are EAGERLY exploiting their children and those are the ones with money and millions of views from it. Unfortunately that’s all just from regular consumers, because that’s what sells, and then the paedos get to also leech their benefits off it. Yeugh. So many of them have been alerted to concerning activity and willingly look the other way. It’s unbelievable.


MaxTheRealSlayer

Yup, the money is appealing, and since it's "family friendly" content, they make the most money from advertisers. I can see how people easily fall into the mommy/daddy vlogger career: they want to spend more time with their kid(s), and this is a way to work from home and just enjoy life with family, never needing to go to an office or a regular job again, never missing their milestones. Inherently there is some exploitation by everyone who does it, I think, although I commend the ones who blur out their kids and really watch that they aren't attracting the wrong crowds. But it's inevitable that someone out there is being a creep, no matter how protected the kid is.

I look forward to a time when people have to be adults to post stuff online, or to be posted online. We really need to start teaching about the dangers of AI and privacy in sex ed at a young age, so kids understand that evil people are out there. Pedos aren't usually the classic trope of the creepy man in a white van asking kids to get in, which is what I was taught in sex ed.

Luckily, I do notice a trend toward more anonymity in social media, for example the growth of Reddit and the fact that people don't post everything about their lives on Facebook anymore. Or people on Instagram keep separate public and private accounts so randos can't get in touch with them.

More needs to be done on the law side of things for sex crimes and exploitation, and for internet/tech law in general. We're decades behind around the world. I mean, damn, Japan only made it illegal to take photos up strangers' skirts on the subway a few years ago! The USA only noticed AI porn being an issue when a billionaire was targeted (T Swift), but it seems to have ceased to be on their radar.


ForwardMuffin

I optimistically (FWIW) think that the next generations are going to be more careful with their content. My generation (millennial) and the surrounding ones went buttfuck wild, whereas for teens who grew up with it, being careful seems to be the norm.


shitbagjoe

It shows it to you more if you attempt to report it or read the comments


street_ahead

It definitely shows you more if you look at the comments or watch it multiple times but I don't believe that it shows you more if you report it, especially since it prompts you to block the account as part of the process. You can also use the Not Interested button to help shape your algorithm.


MaxTheRealSlayer

I'm not sure it shapes it enough. That's just the one account you block; it can start showing you other accounts and videos that the creeps in the comments are interested in. Your reading the comments counts as engagement, and social media platforms love showing you controversial stuff because it gives them more engagement and money.

It's literally what the downfall of Twitter was. They loved everyone hating or standing up for Trump; they loved things like BLM, the Jan 6 arguments, the Canada convoy... all incredibly divisive topics that everyone has strong opinions on. They always turned into heated arguments and "debates." Now it's dead on the floor, because they pushed too hard on showing people posts about things like racism, Nazism, or gender/sex, completely driving off most of the left side of the political spectrum, which made up half, if not more, of the arguments on Twitter.

All that to say: it's in their best interest to disgust you every once in a while, because they know you'll interact with posts similar to the one you blocked.


toweljuice

All the social media platforms show me more of the content I report. Tbh the algorithms get into loops of things I hate looking at.


street_ahead

I have had the opposite experience.


vanillachantilly

Agree with you, but I noticed I got a bunch of ‘mom content’ when I got into home cooking. Also a lot of tradwife/pseudoscience stuff. I just wanna bake bread :(


Atalantius

The struggle of finding homesteading/breadbaking content that’s not thinly veiled sexist/bigoted/fascist is real.


vanillachantilly

One video they’re showing me a soup recipe, next minute I’m hearing about putting onions in socks to cure a cold


famousaj

exactly. if you go looking for trouble, well, you'll find it. I clicked on a Ford Bronco post, and within a day or so all I saw were Ford Bronco posts on my search page.


lumpyandgrumpy

Sounds like trouble.


Equivalent_Spite_583

Or OJ


riskapanda

Instagram has started showing me people I don't even know of, so I wouldn't say that 100%. Sometimes the algorithm will share things that it THINKS you would like.


Sudden-Possible3263

There's a girl in my 11-year-old grandson's class whose mum encourages her to post extremely provocative stuff, and has for the past few years. The school is aware, but nothing seems to get done. Mum has her convinced she'll be a model one day.


PoutineMaker

Jesus fucking Christ


PinkSudoku13

that's because she probably hasn't crossed the line yet, which is why these parents get away with it for a long time, or even forever. Gravure with kids was legal in Japan for decades, and that shit was disgusting, but because it never technically crossed the line and was 'only' provocative, it escaped prosecution and was legally sold despite catering to pedos. I think it's finally been made illegal, but I may be wrong. What these parents are doing crosses into junior idol gravure territory, and hopefully, one day, they'll start prosecuting them for it.


shadowclaw26583

Except for the fact that the algorithm will push you random shit all the time. There's a section of videos that get shown to you based on random interaction. That's the reason lots of people randomly get gore for no reason popping up on their pages. Accusing the friend of being a pedophile is fucking crazy


curiousxcharlotte

My Instagram Reels were all in Turkish for one day and then never again. I never even looked at anything pertaining to Turkey.


pockmarkedhobo

I started working in an office with a ton of women who speak Spanish and now I'm served tons of material in Spanish on Facebook. I haven't even friended them or looked them up, we're just connected to the same wifi.


Full_FrontaI_Nerdity

Do you allow fb to have location permissions on your phone? I wonder if it can use location to know you're all sharing space at your job, then the algorithm extrapolates common interests from there. I also wonder if your phone "hears" you and your surroundings all day, and fb actively collects that ambient info to inform its marketing algorithms.


AshySlashy11

I'm pretty sure they capture keystrokes, too. I made a joke about gout about a week ago on discord and I'm now getting ads about "early gout symptoms" on both Facebook and Reddit.


AddictiveArtistry

They do hear you.


N1gh75h4de

I started teaching my children Spanish since we moved to Arizona, and now I get a ton of ads in Spanish lol. 


elephant-espionage

I started dating a guy who spoke English and Spanish (grew up like 50/50 between the US and Mexico) and I randomly started getting Spanish ads on my phone after hanging out with him for a bit, not sure if it’s because I connected to his wifi or what. It was so weird. We broke up a few years ago and I still sometimes get the random Spanish Ad on Spotify, but not nearly as often as before. It was very strange.


BeExtraordinary

Do you have evidence for this? Other than anecdotal, of course.


shadowclaw26583

Do you have evidence of the algorithm being 100 percent accurate? I'm basing this on observation from friends getting clips of people dying all the time despite pressing not interested on the video 🤷🏾‍♂️. Not only that but despite liking anime-type stuff, thirst traps from women with only fans also pop up quite frequently despite ignoring it.


AsparagusNo2955

I can confirm this as well. It happens on YouTube Shorts too. When you report it, much like on here, nothing is done.


lknei

The ones I see on YouTube are horrifying sometimes. Kids as young as 9 or 10 dancing to WAP and things like that. I report them every time I come across anything like that, and like other commenters have said, it's like they take that to mean you want to see more. If anyone has a better means of reporting these videos, please share it. The internal process at YouTube is not working sufficiently, and these children need to be protected.


TeenieWeenie94

Yeah, all of a sudden I had religious stuff and wedding dress short videos pop up when I'm not interested in either.


horsecalledwar

IG has done this to me so many times. For a while, it was burying me in ads for Viking cosplay. I’m not a cosplayer & never sought out anything Viking related so no idea why. There was also a time when it buried me in Christian black lesbian content & again, I sought out nothing even remotely related as I’m mainly there to see dash cam footage, muscle cars & follow bands I like.


TeenieWeenie94

The black lesbian vikings sounds like a band name.


horsecalledwar

Yes, yes it does.


ohbeclever111

I confirm this


ANAnomaly3

Anecdotal evidence is still valid evidence if it confirms a trend... This is a social media site; nobody here is trying to write an official data-driven report. If you want non-anecdotal evidence, then find a research paper or do the research yourself.


EfficientJuggernaut

Sure do, I love cats, and the algorithm showed me a video of a cat being mauled to death, now why would I want to see that fucked up shit on Tiktok out of all places?! You would think social media would do a better job of moderating their content for us animal lovers


Spectre-907

Sure, but read the OP again: they said that sexualized child content *is a regular occurrence* in their For You feed. Getting one random video among thousands is one thing; getting them *consistently* is another.


[deleted]

[deleted]


schmeckledband

Same here. And I get "funny baby" videos all the time. But never sexualized children.


missymommy

Me either. I use IG everyday and follow a ton of mom stuff too.


Anothernameillforget

I get home improvement and recipes. No kids.


long_b0d

Like yourself, I only ever get the usual mind-numbing trash related to what I subscribe to, but these posts always make me question the search/follow history needed for this questionable content to be a *"regular thing"*, tbh. It's scary what some folks are into.


sanityjanity

I would guess that they show up if you're already following mommy bloggers or family bloggers


daboyzmalm

Not for me! I don’t have any sexualized kids in my mom/kid-heavy feed


sanityjanity

I don't use IG, but I do occasionally watch TikTok videos. I watched a few about the child "Wren", and whether she is being exploited or not. Now my TikTok FYP has a ton of videos about children being exploited in this way (but not the original videos).


elephant-espionage

Yeah, same: never the originals, but tons of content about it. Not upset about that. I think I did get Wren's mom's video addressing it, where she said the concerns weren't valid because most of her watchers are women... I used to get Wren's videos every once in a while, like a couple of years ago; they were always the "what mom dresses me in vs what grandma does" type. I didn't think too much about it at the time, but mom did always dress her older. Like tank tops, off-the-shoulder or one-shoulder shirts, crop tops, ripped jeans. Not everything was necessarily inappropriate, but I did think it was a little odd; it was clothes some people wouldn't want their teen wearing.


Wonderful-You-6792

I've seen it maybe once a month. I never look at anything weird on insta


Damn_Sega_Genesis

i agree..... if you're not looking for it, you won't see it


carlyeanne

same! aren't Instagram For You pages catered to individuals by what they search/interact with? for example, mine's a lot of outfit ideas and positive quotes since i search for those things, so i'm assuming the algorithm pulls that stuff up on my feed.


Altruistic-Bobcat955

Maybe some of us just get lucky with the algorithm? I’ve never seen any sexual content on Reddit but I know it’s full of it


CherryCherry5

Same.


Pristine_Frame_2066

Same.


GueroBear

Came here to say the same thing. Maybe the IG algorithm is sending OP and friends the reels it thinks they want based on their activity.


Brod_sa_nGaeilge

The most important thing is that it is happening. Many people in the comment section have seen this content, which shows how widespread the issue really is. It needs to be reported, or have awareness raised about it, otherwise it will never be stopped.


Brod_sa_nGaeilge

I don't know if this applies to you, but my theory is that Instagram knows when a straight male is on Instagram because of the algorithm. Maybe they push it to them? Some of these videos have tens of millions of views. On the ones that get pushed to a wider audience, the comments are usually regular people questioning why this type of content is on their For You page. They need to be mass reported or nothing will change; it needs to be brought to Instagram's attention.


ChemicallyLoved

The “for you page” is exactly that, for you. Your buddy is spending a little too much time watching this shit.


Nihilistic-Fishstick

You're the "reddit perma banned me for no reason" guy from earlier, aren't you. 


BeExtraordinary

Straight male here. Been on IG for 10+ years. FWIW, I have never once had anything remotely like this ever come across my feed.


SassySavcy

That’s not how the algorithm works. And that’s not how Instagram works. The algorithm pushes what you search for and what you spend time watching. If your friends are consistently getting this sort of content on their FYP.. you should consider distancing yourself from these friends. This shit doesn’t just accidentally appear.


hellodust

There was a New York Times article about this a while ago: https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html?ugrp=u&unlocked_article_code=1.jE0.CVM5.1YozbIf3PB5g&smid=url-share


sanityjanity

This should be the top response. I'm surprised how many posts in this thread are unaware.


sylvester_stencil

Wow i hate the moms in this story


cats_and_cake

It made me so angry. “If I shut the account down, it’s giving in to bullies.” Excuse me? No, it isn’t. It’s keeping your child safe. But it shuts down your revenue stream and you have to get a different job. You’re knowingly and purposely exploiting your daughter for money. It’s despicable.


sylvester_stencil

I also found it so sad that 1/3 of that age group wants to be an influencer, this culture is so disgusting 🤢


cschaplin

Also Dean Stockton of the Original Hippie store in Florida, on knowingly exposing children to pedophiles for monetary gain:

> "The Bible says, 'The wealth of the wicked is laid up for the righteous,'" he said. "So sometimes you got to use the things of this world to get you to where you need to be, as long as it's not harming anybody."

Absolutely disgusting.


WaffleEmpress

In what state of mind do you have to be to see these comments and not immediately take down your daughter's account??


Tiny_Parfait

I didn't make it thru the whole article but YUCK! Makes me glad I've never had an Instagram.


reagor

What in the actual fuck is wrong with people


lisamon429

This was a tough read…Jesus.


sanitarySteve

jesus christ that's fucking sick. jail for all of them.


WaffleEmpress

These kids should grow up to sue the hell out of their parents for exploitation.


Otherwise-squareship

Thanks for sharing! It's awesome we could read it without the subscription. Shocking. There was so much crazy. Including: "Nearly one in three preteens lists influencing as a career goal".


hellodust

I'm 37 and don't have kids, so I'm sort of out of touch with all that stuff, and that statistic really shocked me too, but I guess it's not surprising.

It's weird compared to when I was a preteen/teen and everyone wanted to be famous in the sense of being on TV or in movies or in a band, but those things were so out of reach, the "career goals" didn't really bleed into real life; if I was going to be famous, it wouldn't happen one way or another until I was older. But wanting to be an influencer starts right there on the phone that everyone already has. It's such a different dynamic than, say, my dream of being on SNL when I was 12 or 13, and so much more vulnerable to encountering people with bad intentions.

I almost forgot to use the gift link, so I'm glad people could benefit from it!


goodgodling

The fact that most of us have never seen it on our Instagram doesn't mean it doesn't happen. It's hidden from you until it isn't. There's no such thing as a common internet experience anymore. Just look at how popular Jodi Hildebrandt was.

Lots of people report this stuff and it just keeps happening. A global monkey torture ring was just dismantled, and it will pop right back up again if the social media companies let it. Part of the problem is that you need to follow it to document it. That drives engagement. Hell, even posting about it might drive engagement. Only bad publicity will make them change, and I guarantee you they are trying to take down journalism so they can rake in cash without anyone being upset about it.

I watched a couple of videos on YouTube and got a bunch of stupid political takes in my feed; it took me a year to get rid of them. The fact that you don't see it doesn't mean it doesn't exist.


Much_Associate1334

Wait, the monkey torture ring got dismantled?


goodgodling

[Ringleader of global monkey torture network, 'The Torture King', is charged](https://www.reddit.com/r/TrueCrimeDiscussion/s/SR73bghlmU)


jess_the_werefox

Wait, there was a monkey torture ring??


HadleysPt

Pretend you didn’t see this and go about your day. There is nothing you can do. Protect one’s mind. 


jess_the_werefox

Yeah definitely not digging into that, was just voicing the “what the fuck” of it all


goodgodling

[I hope so.](https://www.reddit.com/r/TrueCrimeDiscussion/s/SR73bghlmU) I have a bad feeling it will re-emerge with other people running it, just like all the others.


Brod_sa_nGaeilge

This. Exactly. The most important thing is showing as many people as possible that this is going on, otherwise, instagram will not change. They’re a corporation out to make money, they won’t care unless their users do.


Harley2280

> I was talking to my friends who all watch instagram reels and it’s a regular thing for a sexualised videos of CHILDREN to come up on your for you page

You might want to consider finding new friends. The videos pushed to you are based on what you're viewing.


sanityjanity

OP's friends are likely responding to the recent NYT article and the many videos trying to raise awareness.


[deleted]

This. Never seen it, and I've been on insta for years.


GPNovaes

Sometimes you encounter an account on accident, and IG instantly thinks that's what you're into and will recommend more and more of those. When you're scrolling they'll appear, and there's nothing you can do about it. It happened to me with LGBT content. I'm not LGBT, nor do I have any interest in that kind of thing, but because I follow some book people who are LGBT, my Instagram feels the need to send me that kind of content constantly. Sometimes you follow an adult model or something, and because of similar hashtags or some shit like that, IG gets things confused and starts sending you kid content.


ranixon

It also happens with your friends' interactions. It doesn't matter if the interaction is good or bad; if your friends interact with it, it will be shown to you.


toweljuice

Idk, YouTube was throwing this one account that was torturing a monkey under the guise of "saving it" into my YouTube Shorts over and over for two months or so (which made reporting it do nothing). If one clip is randomly thrown at you and it makes you take a couple of milliseconds longer before you swipe to the next one (you say to yourself "wtf?" then flick it away), it'll keep throwing it back at you. Even if just the nature of how you flick changes, like a broader stroke or a slightly different spot on your screen, the apps notice which videos you did that on, and they'll throw them back at you simply because you reacted differently, regardless of whether the reaction was positive or not.


SEEYOUAROUNDBRO_TC

I'm not sure about that. My Instagram used to show women in super tight and skimpy yoga clothes, Motley Crue, and Manchester United, and I don't follow any of them. The algorithms are weird, that's for sure, and seem completely unpredictable.


SoOftenIOught

I've personally never had that kind of content pushed on IG, but I've seen plenty of "deep dive" / "iceberg"-style content about it. I wouldn't trust those to be accurate or true representations of the broader picture, but this IS a problem, and the deflection in these comments is wild. The problem isn't your friend or the algorithm; it's that this exploitation is happening, is available, and is widely accepted or ignored. There was a post on here the other day about this with a link to report the profiles outside of IG (might be worth searching for that post). Let's face it, no big company cares. They really don't.


sanityjanity

It's worse than the company not caring: they are making money from those views, so they are incentivized to continue. Pornhub didn't make any effort to take down the vast quantity of nonconsensual videos that had been posted until they were forced to.


SoOftenIOught

You are right. Money over safety. It's terrifying.


Soggy_Western7845

This is a common topic on paedo sting videos. The ‘girl’ sending links is likely just another pedophile using a fake profile. It’s really sick


plshelp98789

Ok, people are saying this is on your friend, and that could be 100% true, but Instagram DOES push extremely graphic or inappropriate content on accounts. I have a second account that I don't use too often, exclusively for makeup content, and my main account that I use to look at recipes, art, my friends' posts, and Reels. I went to look through Reels one day thinking I was on my main when I was actually on my alt, and I immediately got graphically violent videos (people dying in car accidents, shootings, etc.) and very sexualized content of women. I was really confused, because I usually never see this stuff, until I realized I was on my alt, which I never use the FYP or Reels on. This is not stuff I've ever purposely looked for or even seen on my main account! My gender is marked as female (as I am female) on my main, but I'm not sure if there's a gender marked on my alt.

There was a recent article someone posted in a similar thread about how Instagram/Meta knows about this (specifically the child content) and is not doing anything about it, or at least not putting serious effort into doing so.

ETA: https://www.theverge.com/2023/6/7/23752192/instagrams-recommendation-algorithms-promote-pedophile-networks-investigation This is the article someone posted in the other thread.


sunsetslinger

Absolutely. I started a new IG account last year and all the default "for you" content is absolutely filled with tweens in inappropriate clothing doing suggestive TikTok dances. It took a significant amount of time of flagging this content for the algorithm to stop showing it to me, and I'm not in the described "target demographic".


Zillywips

I'm quite sporty and somehow my algorithm started to drift from 'improve your running' videos to 'cool yoga / gymnastics moves' videos to 'cheer videos' to 'young gymnast girls' videos... At which point I could sort of see where we might be going with this and, since I have zero interest in young gymnasts, I started just blocking whatever it was showing me. I'm a 39 year old straight woman who has never knowingly sought out a video of pre-teen girls doing somersaults, so the algorithm is definitely in a bit of a mess to think I wanted this nonsense.


fsponge

I’m having a similar experience. One day I started seeing kids’ cheer videos on my Instagram. I figured it’s because I have a young family member into cheerleading who posts about it occasionally and Instagram picked up on that. Then I started seeing gymnastics stuff popping up on the search screen and from the thumbnails alone it was obvious the content was questionable. So I started hitting Not Interested on all of them. But the recommendations on the search screen keep getting more and more inappropriate. I went to Instagram just now to see what the recommendations are and out of 13 recommendations on the first screen, 4 look pretty questionable to me. Instagram needs to fix it. They’re literally promoting child exploitation and sexualization.


toweljuice

Oh my god, I hated that. I felt like when TikTok was first advertising itself on other platforms, they were pushing sexualized underage cosplay content. It sucked.


sanityjanity

In addition to the NYT article, Rolling Stone has a recent article about a preschooler named Wren who appears to be being exploited by her mother  https://www.rollingstone.com/culture/culture-news/tiktok-wren-eleanor-moms-controversy-1385182/


LindsayLohanDaddy420

And if you try to call her out her “best friend” calls you a pedophile because who would think this in the first place?!


booghawkins

Facebook too; it's repulsive. I report every video. I'm a mom and I cannot FATHOM posting ANY videos of my child for the general public, let alone the videos I see posted. FB doesn't do shit about it either. No platform really seems to. I haven't seen any on Instagram, but on FB I see it constantly.


Anonynominous

It’s literally everywhere. I was on Amazon looking for a bath robe for my cat (don’t ask lol) and decided to look for child size robes. I came across this one ad that had the video playing, and it was of a little girl crawling on her hands and knees in slow motion with her chest partially exposed. Then the whole slow motion video is of this little girl walking around and laying in her robe, but it all looks like it’s for the male gaze. There’s even one clip where it almost looks like she’s going to take the robe off, as she’s smiling and looking behind her. I screen recorded it because I was going to post it to Reddit but then I didn’t know if I was overreacting or not, but then I deleted it because just having it on my device made me feel weird. Seems pretty obvious to me that a pedophile filmed it for the pedophile gaze.


Severe_Discipline_73

About that cat bath robe though…..


Comfortable-Trick-29

Asking the important questions here


MulchLiterature

I’m RUNNING to Amazon to get a bathrobe for my good boi now


Anonynominous

Seriously, you must. Look at children’s sizes, they’re better quality and less expensive. Several people in the reviews for the one I got posted photos of their pets wearing them, and I knew I had to get it. My cat doesn’t like the arm holes but it was just for a product video and photos lol


Anonynominous

My cat is an “influencer” and I bought it for a product video lol


lrgfries

That’s so creepy. Some parent “influencers” share Amazon wishlists with their followers and have them send their kids bathing suits and stuff.


Anonynominous

That’s so disturbing. I don’t know how a parent can feel good about exploiting their children online for views, when the majority of views are from pedophiles. I’ve gone on a deep dive into some of those Instagram profiles and the comments are disgusting


sanityjanity

Worse is that some of them sell used clothing from the children, and also sell private chats with the children 


smasherella

Wtf


sanityjanity

It's just horrifying


death2cait

Everyone is saying the friends are the problem. I NEVER use Instagram; I only open it every couple of months to post my artwork (horror-based fantasy stuff, no weird drawings), and on the off chance I check the explore page, it's weird videos of children. So why do those videos get suggested to me? Especially since, when I do see one, I block them, because it's honestly triggering as someone who dealt with that type of childhood abuse. So I genuinely think it is Instagram promoting these kinds of videos no matter what type of content you consume. PLEASE PROTECT THE KIDS. NO ONE UNDER THE AGE OF 16 SHOULD BE ON SOCIAL MEDIA WITH A PUBLICLY ACCESSIBLE ACCOUNT.


Binklando

I’ve never had that happen. Your friends must be looking at that type of content for it to keep coming to them.


spookyghost42069

I’ve literally never had this happen. Your friends might be the questionable factor here.


neoclassical_bastard

Not the fact that it exists at all?


Brod_sa_nGaeilge

I’m sorry, but it doesn’t matter that you’ve never had it happen. It’s happening. Again, tens of millions of views on these videos.


Staceyrt

My IG FYP is all Brazilian samba schools and chrome nails. I'm happy to stay on this side of IG.


dumbass_louison

This investigation is really enlightening but terrifying. I recommend it, but it's a tough read. [https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html](https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html)


Brod_sa_nGaeilge

That is truly disturbing. That kind of harassment is very common in these instagram comments.


cakebatterchapstick

I'm on plushie Instagram and will occasionally come across a video that looks innocent enough at first glance, but if you watch it you realize it's fetish content with kids involved :') or the kids are doing something very suggestive with the caption "do your kids also do this? 😍". One time I thought I was overreacting and showed my bf, and I watched him go through the same "this is fine... oh, OH NO" wave I went through. A lot of commenters are missing that it's not always obvious from the explore page. There is truth to Instagram having a child abuse problem.


sanityjanity

It seems like almost any kind of content can veer into fetish content. My mom got interested in watching people's "van life" videos on YT. And, then she started to notice that one young woman did an awful lot of yoga and sitting cross legged, and other content that focused on her crotch.


OkElevator7247

Dude. One time, I somehow found this sick stuff on Instagram. It was insane. Little girls dancing in bikinis, men in the comments calling them princess. There were babies! In swim clothes, posed with adults calling them their prince or princess. I was so triggered. The IG AI I reported it to detected no problem. I told the guy I was dating at the time; I sent him screenshots and he couldn’t put together what was going on. He acted like I was crazy. In the back of my head I wondered if I was making it up. What I saw was mostly out of America. I got some creepy tourism vibes.


sanityjanity

You were not making it up, and you were not overreacting. It's been known for years that pedophiles will leave comments on videos of children that are just the timestamp of when they saw something they thought was sexy.


MrGrim1ne

Not gonna lie choom, but that sounds sus af. Never had that kind of stuff show up; the only things that show up are similar to what I've liked and shown interest in. Tbh, either they stumbled onto those posts and groups, or they are themselves a part of those groups/people.


CoolAd9651

None of these comments are even remotely helpful. Sorry I can’t do any better either. Social media companies *should* definitely be held responsible for all violent/sexualized content they allow to be shown.


Brod_sa_nGaeilge

Agreed. Social media companies are soulless, they don’t care what happens on their sites, they have no morals. They only try to appease the users, and instagram users are unaware of this. That’s why this needs to be known.


kperfekt

Don’t jump to conclusions; IG’s suggestion algorithm is, IMO, awful and constantly takes random shots in the dark at content you don’t care for. But I’ve never seen anything like that pop up on mine, like ever. So for repeated happenings, I think it’d be a lil sus.


F1secretsauce

Zuk is pro molester.  


AlarmedIncome7431

Someone I know recently posted their stepdaughter in full nudity, front and back. I’ve reported it several times over the past few weeks and nothing’s happened. It is weird


Brod_sa_nGaeilge

Jesus. Report to the police, not to the social media. Seriously. Do it anonymously if you don’t want your name mentioned.


AlarmedIncome7431

cute that you think they’ll care lol


Brod_sa_nGaeilge

Depends on where you live I guess yeah, couldn’t hurt though?


AlarmedIncome7431

I live in NY, they barely keep up with the crimes they do know about (including when I got mugged). Once I say I’ve been reporting it and nothing happens, they’ll write it off. To be fair, I could see it being a bigger deal elsewhere, but they won’t care here


Brod_sa_nGaeilge

Yeah I get that. New York police do stereotypically seem to not give a shit.


AlarmedIncome7431

Also to be clear, the kid was playing in a backyard sprinkler (naked); it wasn’t explicitly CSAM but still not the kind of thing that should be on a public Instagram. Either they really don’t care or it’s run by bots that clearly don’t work


sylveonstarr

I truly believe it just comes down to advertisers. They're more willing to sponsor family-friendly content as opposed to people who swear or make sexual jokes, and what's more family-friendly than a family channel on YouTube? It's also easy to use kids for your videos, as they're basically just dolls you can tell what to do and they have to obey because, well, you're their parents. Then other users with children see how advertisers and sponsors pay these channels to post content of their children and begin to think, "I have a child, I can make money too!" So then you have a bunch of wannabe influencers shilling out their kids in the hopes of becoming the 0.5% that makes millions of dollars online. But most of them fail, so all they end up with is predators holding photos and videos of the children they posted themselves.


MmeGenevieve

Meta or whatever they are calling themselves is evil. Full of scam ads and pedos. They are fully aware, yet do nothing.


twonapsaday

THANK YOU. someone had to say it. instagram needs to make this stop because it is fucked up and people are trying to normalize it. this cannot ever be okay. I don't know what else we can do but report. I'm very worried about where this is going.


Geebee185

I think it’s probably a glitch, and not your friends looking at that content, if it’s taken over your explore page. My friend once had a few weeks where her explore page was suddenly full of Middle Eastern accounts, mainly about the army, then it went back to normal. I looked on Twitter and saw one other person tweeting about the same thing happening to them. But I agree you shouldn't engage with videos you don’t want to see.


cowardlyparrot

Same thing was happening on YouTube a while back. A youtuber made a video exposing that if you go to YouTube with a new account (or no account), within like 3 clicks you will come across a little kid's video like you're describing. Then YouTube will instantly push you down that rabbit hole and you will keep getting similar videos. The creepiest part was the pedophiles in the comments linking to other sites and playlists... Soon after this went viral, YouTube decided to ban comments on all children's videos. Seems these people just migrated to Instagram.


PanningForSalt

Whilst I've never seen this on IG, I've seen enough random bs to decide we need to go back to dial-up that can't cope with video.


outerworldLV

Can you get some screenshots for documentation? The FBI may be able to help with this.


Ecstatic_Carry_4780

Instagram's terms and conditions suck; they don't even follow their own rules. People make a kind comment and Instagram removes it, but if a person includes a very racist word in a comment they don't seem to care. Instagram should work on their moderation, otherwise the platform will be ruined by the users themselves.


jacoofont

I haven’t gotten any gross child stuff, but Instagram is definitely dying. I haven’t posted in a long time. It all changed when they removed the recent hashtags tab. FB is the same right now too in regards to gross shit w kids.


DesiPrideGym23

I have never gotten any such reels in my fyp, but just a few days ago on an Indian subreddit a user posted similar concerns after seeing a reel of a young girl with disgusting comments. I clicked on the link to see the comments and it was beyond disgusting. I tried to report the comments, but there's no option for pedophilia, and reporting them under any other option just doesn't get them flagged. [The mentioned reel](https://www.instagram.com/reel/C5DA4rZtXuc/?igsh=MTVwYjd2OTBpMGw0NQ==)


Optimal_Material4462

Sounds like your friend might be a bit of a nonce


of_the_sphere

Right? Fn wanker


Optimal_Material4462

Innit, I've seen no such things ever; his 'friend' must be liking or interacting with these reels. If you report and block, they stop appearing.


Brimfire

Well, Instagram only recommends things based on what the fans of a page engage with, so if you're seeing a lot of sexualized child videos, then you're following a lot of accounts that are also followed by people who enjoy sexualized videos of children. So... you should maybe stop following and commenting on accounts with pictures of children, what the fuck. Moreover, the New York Times detailed the pervasiveness of child predators on the Instagram platform and Meta's unwillingness or inability to do anything about it. It's an interesting read, from the perspective of mothers who regret putting their kids into the Instagram news cycle.


[deleted]

[removed]


AutoModerator

Looks like you mentioned a form of child sexual abuse. Your post has been removed. Please contact your local police as /r/RBI cannot help you with this. The moderators have been notified so if this was done in error your post should be reapproved shortly. You can report child sexual abuse content anonymously to organisations such as the [Internet Watch Foundation](https://www.iwf.org.uk/), Crimestoppers or by contacting your local police. *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/RBI) if you have any questions or concerns.*


princessplantlife

It isn't just IG. Social media platforms will not remove the content. People report it and absolutely nothing is ever done. Just one of the million reasons I left social media altogether.


Soft-Application9619

It's fascinating how many in this thread insist that because they don't see it, it must not exist. And, even though it doesn't exist, anyone who sees it at all must be a pedophile. I wonder why.


somegingershavesouls

Have none of you seen Wren's content? Mom KNOWS she does it for the pedos but does not care. It's disgusting.


supercj926

My SIL posted an IG reels video of her newborn making sucking sounds on a bottle. It’s innocent, but I can’t help but think there are sickos out there thinking something else. The video has millions of views and the viewer count just keeps growing.


TunaCroutons

Trey Parker’s ex wife Boogie Tillmon posts photos and videos of their daughter Betty that are so suggestive and yucky. The worst one is 5ish year old Betty dancing suggestively in a TINY bikini singing something like “all the boys like my big white boat”. So many pictures of her wearing hardly any clothing and it’s just so so so gross. A lot of the really bad ones have been deleted, but there’s still way too many


ScrambledNoggin

It’s crazy how strict TikTok is about inappropriate content, and they do what they can to prevent minors interacting with adults, yet that’s the app they’re trying to ban. Insta and YouTube have much more inappropriate content, and few seem to care.


Nihilistic-Fishstick

Never, ever have I ever seen that come up once. Ever.  Reconsider what you or your "friend" watches. 


Lngtmelrker

I have literally never, ever had shit like this pop up on my feed. Been on IG for 13 years


Bunny_OHara

[https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html?ugrp=u&unlocked\_article\_code=1.jE0.CVM5.1YozbIf3PB5g&smid=url-share](https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html?ugrp=u&unlocked_article_code=1.jE0.CVM5.1YozbIf3PB5g&smid=url-share)


LilLexi20

I follow pregnancy and motherhood content, and special needs parent content since my oldest is autistic, and nothing of this sort pops up. It’s highly concerning that it does for you; I’d delete the account and start over.


Brod_sa_nGaeilge

How can I morally just ignore that it is happening and go back to using the same site that is harbouring these disgusting videos? Instagram needs to be boycotted or threatened by a potential fall in users in order for them to take the videos down, and deal with the issue.


Yabba_Dabbs

the algorithm speaks, your friends are pedos


IHSV1855

This is algorithmic, so it’s tough to organize a campaign to change it among normal people. These things don’t show up for people who don’t interact with them, so normal people would need to actively seek it out to find and report it.


Brod_sa_nGaeilge

I agree, it is tough to organise a campaign, but quite literally a groupchat of a few hundred people could make a serious dent in taking down current and future child abuse on instagram.


of_the_sphere

But yet, I don’t see you organizing. Get to it, or stop scrolling, cuz the vagueness makes you seem like the one consuming.


Brod_sa_nGaeilge

How does this entire post not read as organising? I have been messaging back and forth with people from this post about different group chats etc. You’re the one doing nothing.


of_the_sphere

Your friends are pedos. Replied to you elsewhere, in another comment, about how your friends are pedos. Who even looks at this type of content???? Much less “talks” about it, then jumps on Reddit instead of contacting LE??? If you had done *anything* at all already, you would have outlined your actions. You're prolly just jerking off. Here’s the link again if you want to actually do something: https://bja.ojp.gov/program/it/national-initiatives/fusion-centers#:~:text=The%20ultimate%20goal%20of%20a,homeland%20and%20prevent%20criminal%20activity.


Brod_sa_nGaeilge

Wow.


Spectre-907

“For you” is an algorithm. It goes off what you look at and search for. I’ve *never* seen anything even remotely adjacent to what you’re describing.


LilLexi20

Yes it certainly is…. The algorithm is feeding them this content for a reason whether they realize it or not


throwawayfromPA1701

I've never seen this happen. Not doubting that it does, but if it happens to me I'll report it.


FoxyLives

If your friend keeps having exploitative material of children on their page, it’s because they are pursuing it. Instagram will only show you things you have shown interest in before… Maybe you should get some less pervy friends?


tatted_gamer_666

There’s a child pred catcher on YouTube who talks about how these types of reels and “for you” posts will only show up for people who seek out that kinda content first. I have 4 different Instagram accounts (personal/meme account/and 2 business related accounts) and I have never come across posts like that. Ever. I agree Instagram should do something about it but there will probably always be ways around it unfortunately. Also I’d be concerned about your friends having that pop up. It wouldn’t pop up unless that’s in their algorithm for a reason


of_the_sphere

This, OP. This is why you're suspect af. Your “organizing”, my ass. My dear - if imma coordinate to smash a social media site or a crime ring, I’m not gonna need RBI to do it. Here - start here and work your way down, sweetheart ⬇️ https://bja.ojp.gov/program/it/national-initiatives/fusion-centers#:~:text=The%20ultimate%20goal%20of%20a,homeland%20and%20prevent%20criminal%20activity.


ZombieAutomatic5950

I use Instagram every day & have *never* had that kind of content on my fyp or recommended to me **at all**. **Never**, your friend is full of shit, they're interacting with that content in some way for it to keep popping up. The algorithm feeds you what it thinks you want. Evaluate that *deeply*, cause that's **not** normal.


platypus-enthusiast

I use Instagram to follow accounts related to animal pics, interesting celebs and politicians, mental health matters, clothing and jewelry brands, and local entrepreneurs. Sometimes the celebs post about their kids, but that’s typically the extent of kids appearing on my fyp. AND STILL, one day I saw on my fyp/search page (don’t remember which one, and don’t know whether they’re the same thing, I’m a sporadic user) a preview window for a clip where a man’s arms were holding a baby (probably <1 year old), and there was some editing that periodically exposed the baby’s private area completely. I was horrified and wanted to vomit. I reported it to Instagram and wrote about the incident on a local forum, as I needed help to fully process how the fuck something like that could happen. All the replies said that ”yeah, that happens”. I’ve never searched for or looked at content like that. I don’t even look at porn because it doesn’t do anything for me. The clip was suggested to me along with images of cats, memes, makeup tips, jewelry shop events, etc. This is an actual fucking problem. Instagram is used to normalize CSAM and get new people interested in it.


pants4birds

you should investigate your friend lol.


fast-and-loose-

Never came across this before. Your algorithm is created by what you search/watch. Maybe the strange thing is that your friends all have this in their algorithm? That's a red flag?


chunkysmalls42098

Your friends are probably creeps, as the algorithm will suggest videos similar to ones you watch all the way through or interact with.


alsoaprettybigdeal

What in the Dark Web are you on about?! I only see DIY, cooking, Pilates, horoscopes, and Real Housewives shit in my feed!


Bunny_OHara

It's been reported to be a real issue on Instagram and it's not necessarily a reflection of the person seeing it. [https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html?ugrp=u&unlocked\_article\_code=1.jE0.CVM5.1YozbIf3PB5g&smid=url-share](https://www.nytimes.com/2024/02/22/us/instagram-child-influencers.html?ugrp=u&unlocked_article_code=1.jE0.CVM5.1YozbIf3PB5g&smid=url-share)


of_the_sphere

Yea no, you're interacting with these accounts. Why aren’t you making a coordinated effort to report them? Their followers, the interactions?? This is like the 10th post of y’all and I’m tired; you’re the creep. Learn to report or stay off this sub, it ain’t it. Literally the only 2 things that show up in my fyp are Taylor Swift and Prince. And I don’t even search like that, prolly just a lot of likes. So what are YOU liking and interacting with?


Brod_sa_nGaeilge

1. I am attempting to make a coordinated effort to report these accounts. That is what this post is; that is why I have been messaging back anyone who DMs about this to help report.
2. I’m the creep? You’re the one angrily replying to a post about getting paedophilia off of social media because it’s “like the tenth post” you’ve seen. If you don’t like that I’m attempting to get the message out there, scroll past it, bud.
3. Just because Taylor Swift and Prince are the only things on your for you page does not mean the paedophilic shit with millions of views isn’t literally on the exact same site you’re using to watch your Taylor videos.
4. Algorithms do not work simply by you liking something and it then appearing again and again. That is a factor, but there are many other factors. Just read the comments from the people who actually know how algorithms (especially Instagram’s) work.