Drugs cartoon video


One of the videos was a Breaking Bad-themed cooking show in which the host dressed up in a respirator and joked about inhaling fumes while recreating the show's RV in a Minecraft project. Another alarming piece of content, from an Indian beauty influencer, could expose kids to toxic body-image issues: the video showed how to apply skin-bleaching products, and it was available to older kids. This is on a product that, YouTube claims, uses extensive machine learning to filter out harmful content. The company says it holds a higher bar for which videos can be part of the app and also empowers parents to control what content their child can and cannot see.



How Media Use Can Affect Kids

The YouTube video starts with a popular British children's cartoon character, Peppa Pig, introducing herself and her family, but there are signs of trouble almost immediately. During the ninth second, Peppa's mother opens her mouth and shouts, "Smoke weed!" The video - a doctored version of a real Peppa episode - deteriorates from there. Over five minutes, there are explosions and racial and homophobic slurs, culminating with Peppa and her parents in dark sunglasses smoking marijuana as rapper Snoop Dogg dances nearby.

Since it was uploaded, the altered video, which has no age restrictions, has been viewed more than 1. After years of vowing to police inappropriate content, YouTube continues to deliver violent imagery, drug references, sexually suggestive sequences and foul, racially charged language in clips that reach children at a troubling pace, say researchers, parents and consumer groups.

YouTube's recommendation algorithm, they say, fails to reliably segment content by appropriate age levels, and its default autoplay feature delivers almost-endless streams of videos that confuse, upset and titillate young minds.

Though many parents try to monitor what their children watch, the sheer volume of YouTube content that many consume makes that impractical, especially when a single, short clip can deliver an array of off-key messages. Steyer and others say YouTube exemplifies a tectonic shift in children's programming over the past generation - away from the federally regulated, time-constrained world of broadcast television. The increasingly dominant online world delivers all-but-unregulated content that can be uploaded by virtually anyone, almost anywhere in the world, and that can reach children at any time, depending on parental vigilance.

YouTube has consistently said its platform is not intended for children, and it created the YouTube Kids app to satisfy the demand for more heavily curated content aimed at younger audiences. "We don't allow users under 13 to create or own accounts on YouTube, and when we identify an account of someone who is underage, we terminate that account - we terminate thousands of accounts every week as part of that process," the company said.

But parents, consumer groups and pediatricians report that YouTube itself is wildly popular with children, more so than the kids app. Children watch YouTube on mobile devices - their own and their parents' - as well as on internet-enabled televisions and via browsers on laptop computers. Through browsers, YouTube has no requirement that users sign in, except in cases when a video carries an age restriction, meaning there's no practical barrier to children watching most videos on the service.

Videos with age restrictions require users to sign in to an account before watching, so YouTube can establish that they are at least 18. This step can deter children, though experts say many children lie about their ages to create accounts on YouTube and other services. YouTube said its "review team" puts age restrictions on content that contains vulgar language, nudity, violence or dangerous activities, when such videos are brought to YouTube's attention.

Content with age restrictions cannot carry ads or be monetised by its creators. In a Pew Research Center poll last year of parents with children younger than 12, more than 80 per cent said their children watch YouTube, and 34 per cent said their children watch it regularly. In its annual survey, the market research firm Smarty Pants named YouTube "the #1 kid brand," calling it "the most powerful brand in kids' lives."

Researchers say YouTube's algorithms for recommending content cause particular problems for children because they often sit in front of a screen for long stretches, watching whatever plays automatically.

The content, however inappropriate in the view of parents, can mesmerise children lacking the maturity to turn away from words or images that may be unhealthy. Problems can be especially severe when children search for such popular and seemingly innocuous terms as "Spiderman," "superhero" and "Elsa."

Critics also say the problem is not visibly improving. Typing "Peppa" into YouTube's search engine, for example, generated at least one recommended video that the researchers classified as "disturbing" 70 per cent of the time, according to research published in January based on reviews of 2, YouTube videos. Researchers querying such keywords found clips showing a scantily clad Elsa straddling another partially undressed cartoon character, Spider-Man wriggling beneath the sheets with a life-size Rapunzel doll, and Peppa encountering a monster with knives for hands who, amid profanity-laced dialogue, slices open the heads of various characters in bloody attacks.

YouTube said that none of those videos violated its policies and that none appeared on the YouTube Kids app; the one featuring a violent monster carried age restrictions. The researchers also found that children have a 45 per cent chance of seeing at least one "disturbing" or otherwise inappropriate video within 10 clips - a stretch that often amounts to an hour or so of viewing.

Some videos were crude animations; some featured actors wearing costumes. Still others, including the marijuana-themed Peppa video, were actual children's videos that had been doctored, with new words and images spliced into the original. Peppa Pig, a British animated television series for preschoolers that debuted in 2004 and has a global audience, is a particularly popular target for those who make fake alternative versions of original episodes or entirely new videos based on characters from the show.

Entertainment One, which produces Peppa, declined to comment. Graduate students at the Cyprus University of Technology who reviewed the videos featuring several characters popular with young children found that the fake videos often contained violence, profanity and sexually charged behavior. "It's definitely happening more often than I'm comfortable with," said Jeremy Blackburn, a computer science professor at the University of Alabama at Birmingham who co-authored the study and has three children.

Biologist Leonore Reiser, who lives in Oakland, California, said her 9-year-old recently reported seeing videos on YouTube that concerned her. When she checked his viewing history, Reiser found one labeled "Two girls want to expletive one man." It depicted two young women in bathing suits competing for the attention of a man at a public pool and, elsewhere in the video, the same man fondling one of the women's breasts.

Reiser later determined that her son, during a period when he was viewing YouTube unattended, had searched for profanity, causing the video to surface as a recommended option. When The Washington Post reviewed the clip, the next set of videos that YouTube's algorithm recommended - on a panel running down the right side of the screen, under the heading "Up next" - included numerous scenes of sexual intimacy that stopped short of nudity.

Most were not restricted by age. While unhappy about the content her son found while searching with profanity, Reiser was more upset about the lyrics in the rap videos that often accompany the football and basketball highlight videos that her son frequently watches. YouTube long has vowed to clean up inappropriate content on its platform amid a succession of controversies involving violent extremism, hateful online conspiracies and troubling children's content.

The company has hired thousands of human moderators and sought to refine its artificial-intelligence systems to better identify and remove videos that violate its community guidelines. But the efforts have failed when it comes to many types of objectionable content, say researchers, parents and advocates. In just the past few weeks, a video blogger has documented how pedophiles scour videos of children and share time stamps of sexually suggestive moments, in what the blogger compared to a "soft-core pedophilia ring," and a Florida pediatrician found that a clip explaining how to commit suicide had been spliced into children's videos on YouTube and YouTube Kids.

Such controversies have prompted some advertisers to abandon YouTube, as Disney, Nestle and others recently did. But the persistence of the problems has made some of those who study the platform conclude that YouTube's almost incomprehensible scale - with hours of new video uploaded every minute - has made it so difficult to police that parents should keep their kids from watching it. One prominent critic, artist and author James Bridle, who used a Medium post in November to highlight troubling content YouTube was delivering to children, said the fixes attempted by the company have not worked.

"The algorithm continues to identify and deliver the most extreme and stimulating content," Bridle said. "Now imagine that applied to small children." YouTube's recommendation algorithm uses machine learning and artificial intelligence to study what users are viewing and to suggest other videos to them.

With autoplay activated, as it is by default on YouTube, the service will continue delivering videos with similar themes and features indefinitely. Former YouTube engineer Guillaume Chaslot, who left the company and now runs the watchdog group AlgoTransparency, said YouTube will struggle to curb the flow of inappropriate content as long as its artificial intelligence is trained to maximise "watch time," a metric that fuels the company's advertising-driven business model.

The result, he said, is a recommendation system that prioritises stimulating users above all else. In the recent scandal involving apparent pedophiles listing the time stamps of provocative images in video comment sections, Chaslot said the recommendation algorithm helped spread such content.

"So the best artificial intelligence in the world is doing that," Chaslot said. Federal privacy law also complicates the picture. Sites aimed at children younger than 13 are forbidden from gathering most types of personal data on users without parental permission. YouTube, which relies on data-driven advertising for its profits, avoids that restriction by maintaining that the service shouldn't be used by children at all. If YouTube explicitly recommended certain content for children, that position would become untenable, Chaslot said.

Several consumer groups last year filed a complaint with federal regulators alleging that YouTube is aware that children regularly use the site, despite its policies, and as a result routinely violates the privacy law. Chaslot argued that to make YouTube safer for children, the company needs to prioritise something other than "watch time," ideally a system in which parents rate videos based on educational value or appropriateness rather than simply what children click on or watch automatically on autoplay.

YouTube says it has reduced the emphasis on "watch time" in its recommendation algorithm. Pediatrician Jenny Radesky, a University of Michigan researcher who examines the effects of advanced technology on children and families, said kids struggle to understand some of the videos, particularly when characters they admire act in inappropriate ways. This challenges their emerging sense of right and wrong and causes confusion.

It may also cause some kids to imitate the behaviours they see, Radesky said. And the strain on parents is serious as they seek to protect their children from troubling images and messages that are always as close as the nearest mobile device. Many parents say they find constant monitoring impractical and content controls limited and difficult to use. "It's putting so much of the onus on parents to be monitoring and watching with kids," Radesky said.

Sen. Edward Markey, D-Mass., favours requiring labeling for the age appropriateness of videos to help families make viewing decisions, and he wants to scrutinise the use of design features, such as autoplay, that encourage especially heavy or even compulsive consumption. "So we have to solve the problem," he said. "We have to put rules in place to protect kids."

Reiser, the Oakland mother whose 9-year-old found inappropriate videos, said her recent discoveries have made her want to block the platform from her home. "I deleted it from the TV, and I'm deleting it from my iPhone," she said. Washington Post.



VIDEO PREMIERE! 'Galaxy Noise' by Only Drugs


Say No to Drugs Cartoon - Educational Videos for Students - Watch Cartoons Online - Drug A.

Health Videos

By Todd Ciolek. The TV cartoons of our childhood lived by a few simple rules: toy lines must be pushed, animation must be cheap, and, to keep the parents happy, lessons must be learned. And in the Nancy Reagan era, nearly every cartoon in the nation dedicated an episode to exploring the evils of controlled substances, all overseen by producers who were likely doing blow off of Care Bears cels. Hey, it was the '80s. This made for some of the most hypocritical, heavy-handed moralizing ever animated, but there were deeper forces at work. For the stranger shows, it was a challenge to work cocaine symbolism into The Biskitts or Mighty Orbots. For the writers frustrated with the nobody-ever-dies precept of American cartoons, it was an opportunity to finally kill someone off.

Animated Video: Health of the Healthcare System


Adolescence and emerging adulthood are periods of significant brain growth and development.

YouTube Kids shows videos promoting drug culture and firearms to toddlers

A former ballet dancer, she knew the importance of holding an audience's attention. Organizations in Guatemala and Sri Lanka have used it as a teaching tool, as have the U.S. Air Force and advocates in the deaf community. Those travelers then infect their family members and neighbors. The video ends with health care workers demonstrating how to prevent the spread of the virus.

Animated music videos: 29 awesome examples

Nuggets is a 2014 animated short film and YouTube video created by German animation director Andreas Hykade. Nuggets follows a kiwi who encounters nuggets as it walks down a path. The short portrays the nugget as a metaphor for some sort of drug, with the story itself functioning as an allegory for the stages of addiction. The video was covered by media outlets internationally and achieved virality on YouTube, receiving over 3 million views in a little over a month following its release. In an April interview with the German newspaper Stuttgarter Nachrichten, Hykade stated that half of a second part to Nuggets had been funded.


A database of evidence-based alcohol and drug education resources. By default, results with the highest evidence rating that are Australia-based are displayed first. A collection of real stories from Aboriginal and Torres Strait Islander people about the effects of smoking.

This animated video was created for doctors and tells about the office visit as a healing experience. With modern medicine, some of that healing touch has gone by the wayside. Doctors must also practice the art of healing, with open communication and relationship-building with patients. A positive patient experience improves health outcomes, reduces lawsuits, and boosts your bottom line.

Hey, kiddo.

Quitting drugs is challenging because addiction changes the brain.


Comments: 4

  1. Fenrizshura

    As the expert, I can assist. Together we can come to the right answer.

  2. Chuchip

    It is remarkable, it is a valuable piece

  3. Arasida

    You are not right. I propose we discuss it.

  4. Geldersman

    I think you are wrong. I am sure, and I can defend my position. Write to me in PM, and we'll talk.
