Instagram has added video, just a few short weeks after I wrote about the video sharing app Vine. I guess the folks at Instagram saw some of their market share shrinking and needed to jump on the video bandwagon. Or maybe this had been their plan all along. Either way, what does this mean for parents of Instagrammers?
What should parents know about video on Instagram?
The basics of Instagram are still the same. You take a picture or upload a picture, add a filter (or not), then share with your Instagram followers. If the Instagram account is connected to social media sites, the photo can be shared on Facebook, Twitter, or Tumblr.
These basics now apply to video. You take a video from within the app, add a filter (or not), and then share. As with photos, videos can be geotagged with a location and can be viewed on an Instagram web profile.
Here are a few more key points:
If your child’s photos are private, their videos are private too
The privacy settings for an account apply to all content – photos and videos. [Learn how to make Instagram account private]. Keep in mind, however, that a user’s PROFILE is always public, so screen your child’s profile to make sure it does not include any personal information. For example, you might have them remove their last name, or their school if it’s listed.
Your child can view videos that have been shared publicly
All Instagram users can access the “Explore” tab in the Instagram app. Here, your child could stumble upon any publicly shared photo or video. They can also search by “hashtag” – for example, #sunsets for pretty pictures of sunsets, or #video to find videos.
Instagram is not designed for the 13 and under crowd
Instagram is rated 12+ in the App Store (and Vine is rated 17+). Instagram seems less tolerant of inappropriate content than Vine, as stated in its terms:
“While we respect the artistic integrity of photos and videos, we have to keep our product and the content within it in line with our App Store’s rating for nudity and mature content. In other words, please do not post nudity or mature content of any kind.”
Your child could come across content you deem inappropriate
By clicking on friends’ profiles or perusing the “Explore” tab, your child could encounter nudity or other mature content despite the app’s terms. This could be shocking or downright scary, depending on the age of your child and the photo or video in question. Here is a warning I saw when clicking the hashtagged word “suicidal,” which sadly seems to be a frequently used hashtag. Even with the warning, if you click “Show Posts” you will see some disturbing imagery.
Mature content can be reported. Tap the three little dots below a photo or video, then choose “Report Inappropriate.”
Is your child using Instagram? Have they upgraded the app to the new version with video? If so, keep these points in mind during your next internet safety discussion (you’re having them, right?!?). Instagram can be a fun way to explore photography and video, for the right age group, with the right supervision and expectations. Let’s see those #sunsets!