I made a calendar that automatically updates VTuber streaming schedules

Introduction

Nice to meet you. My name is Shun, and I work as an engineer. I made a calendar that automatically updates VTuber streaming schedules, and I would like to write up the process. In this post, I use the YouTube Data API to fetch video information from a YouTube channel.

Environment

Data to prepare

- YouTube Channel ID

Data to be acquired

- Channel information
- Video information

Method

First, register your application with Google so that you can use the YouTube Data API. For details, see here. (You can also try the API from that page.) The API key obtained here is used to fetch the video information.

The acquisition flow looks like this.

  1. Get the video ID of the channel from the channel ID
  2. Get video information from video ID

1. Get the video ID of the channel from the channel ID

You can get the channel ID from the URL of the channel's home page. For example, for my favorite Minato Aqua, the channel URL is https://www.youtube.com/channel/UC1opHUrw8rvnsadT-iGp7Cg, so the channel ID is **UC1opHUrw8rvnsadT-iGp7Cg**.
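As a minimal sketch (assuming the URL follows the /channel/<ID> pattern shown above), the channel ID can be pulled out of the URL like this:

# Minimal sketch: extract the channel ID from a /channel/<ID> URL
channel_url = 'https://www.youtube.com/channel/UC1opHUrw8rvnsadT-iGp7Cg'
channel_id = channel_url.rstrip('/').split('/')[-1]
print(channel_id)  # UC1opHUrw8rvnsadT-iGp7Cg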

Use this channel ID to get a list of videos.

import urllib.request
import urllib.parse
import json
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLSv1)

def get_video_info(channel_id, page_token=None, published_after=None):
    # Search the channel's videos via the search.list endpoint
    url = 'https://www.googleapis.com/youtube/v3/search'
    params = {
        'key': 'YOUTUBE_API_KEY',  # replace with your own API key
        'part': 'id',
        'channelId': channel_id,
        'maxResults': 50,
        'order': 'date'
    }
    if page_token is not None:
        params['pageToken'] = page_token
    if published_after is not None:
        params['publishedAfter'] = published_after
    req = urllib.request.Request('{}?{}'.format(url, urllib.parse.urlencode(params)))
    with urllib.request.urlopen(req, context=context) as res:
        body = json.load(res)
        return body

In addition to the channel ID, the parameters specify the API key, part=id, maxResults=50, and order=date; pageToken and publishedAfter are added only when they are given.
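As a quick usage check (the datetime below is just a placeholder value), you might call the function and look at the IDs in the response like this:

# Hypothetical usage check with placeholder values
body = get_video_info(channel_id='UC1opHUrw8rvnsadT-iGp7Cg',
                      published_after='2018-07-01T00:00:00Z')
print(body.get('nextPageToken'))  # present when there are more pages
for item in body['items']:
    # search results can also be channels or playlists, so videoId may be absent
    print(item['id'].get('videoId'))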

2. Get video information from video ID

Use the acquired video IDs to fetch detailed information for each video.

def get_video_details(video_ids):
    # Fetch detailed information for the given videos via the videos.list endpoint
    url = 'https://www.googleapis.com/youtube/v3/videos'
    params = {
        'key': 'YOUTUBE_API_KEY',  # replace with your own API key
        'part': 'snippet,liveStreamingDetails',
        'id': video_ids  # comma-separated list of video IDs
    }

    req = urllib.request.Request('{}?{}'.format(url, urllib.parse.urlencode(params)))
    with urllib.request.urlopen(req, context=context) as res:
        body = json.load(res)
        return body

This time the video IDs are passed as the id parameter, and snippet and liveStreamingDetails are specified in part. With these, you get the basic information of each video plus its live-stream details.
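For reference, here is a sketch of reading the fields that matter for a schedule calendar from the response. The video IDs are placeholders, and liveStreamingDetails only appears for videos that are (or were) live streams:

# Sketch: read the title and start time from each returned video
body = get_video_details('VIDEO_ID_1,VIDEO_ID_2')  # placeholder IDs
for item in body['items']:
    title = item['snippet']['title']
    live = item.get('liveStreamingDetails', {})  # absent for normal uploads
    start = live.get('scheduledStartTime', item['snippet']['publishedAt'])
    print(title, start)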

def get_videos(items):
    # Collect only the search results that actually have a videoId
    video_ids = []
    for item in items:
        if 'videoId' in item['id']:
            video_ids.append(item['id']['videoId'])
    video_details = get_video_details(','.join(video_ids))
    for video_detail in video_details['items']:
        print(video_detail)

Since the search results obtained in the first step can also include playlist IDs, only the entries that have a videoId are collected before fetching the video details.

The code that runs these steps together is shown below.

# Fetch the first page starting from the given datetime, then keep following nextPageToken
video_info = get_video_info(channel_id='CHANNEL_ID', published_after='DATETIME')
get_videos(video_info['items'])
while 'nextPageToken' in video_info:
    page_token = video_info['nextPageToken']
    video_info = get_video_info(channel_id='CHANNEL_ID', page_token=page_token)
    get_videos(video_info['items'])

First, specify the date and time to start from, and then keep fetching as long as a nextPageToken is returned. In the case of the Minato Aqua channel, the channel was created on July 31, 2018, so if you start from a date older than that, you can get the information for all of its videos.
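For reference, publishedAfter expects an RFC 3339 datetime string; here is a minimal sketch of building one (the start date is just an example older than the channel's creation date):

from datetime import datetime, timezone

# Example: any date before 2018-07-31 covers the whole channel
published_after = datetime(2018, 7, 1, tzinfo=timezone.utc).strftime('%Y-%m-%dT%H:%M:%SZ')
print(published_after)  # 2018-07-01T00:00:00Z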

The calendar itself gets updated after this step, but I will save that for another post.
