Training PaintsChainer on new data

The source code of PaintsChainer is published at http://qiita.com/taizan/items/7119e16064cc11500f32. There are plenty of articles and blog posts about running PaintsChainer with the pretrained model, but I couldn't find anyone who had tried training it on their own data, so I gave it a try.

Addendum 2017-04-04: The electricity cost was overestimated by one digit. It wasn't that expensive after all.

Preparing the training data

Preparing the original images

First of all, you have to prepare the training data. It is hard to collect a large number of images, and I couldn't use Pixiv the way the original article did, so here I used video from [a certain anime](https://ja.wikipedia.org/wiki/%E6%B6%BC%E5%AE%AE%E3%83%8F%E3%83%AB%E3%83%92%E3%82%B7%E3%83%AA%E3%83%BC%E3%82%BA). I converted the MP4 videos I had into images, one per frame. For now, I used episodes 1 to 6 of the original run, extracting every frame regardless of opening, ending, or scene. There are 172,729 images in total.

I used ffmpeg to extract the frames. I placed the generated images under cgi-bin/paint_x2_unet/images/original in the directory where I checked out PaintsChainer.

$ cd cgi-bin/paint_x2_unet
$ mkdir -p images/original
$ ffmpeg -i movies/movie_01.mp4 -f image2 images/original/movie_01_%d.jpg

Do this for all six episodes. Put every episode into images/original/, making sure the file names do not collide.
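If you want to script the extraction, a minimal sketch in Python is shown below. It simply shells out to ffmpeg for each episode; the movies/movie_0N.mp4 naming follows the example above and is an assumption, so adjust it to your own files.

import subprocess
from pathlib import Path

# Minimal sketch: extract every frame of episodes 1-6 with ffmpeg.
# The input file names follow the movies/movie_01.mp4 pattern above (an assumption).
out_dir = Path("images/original")
out_dir.mkdir(parents=True, exist_ok=True)

for ep in range(1, 7):
    src = f"movies/movie_{ep:02d}.mp4"
    dst = str(out_dir / f"movie_{ep:02d}_%d.jpg")
    # -f image2 writes one numbered JPEG per frame
    subprocess.run(["ffmpeg", "-i", src, "-f", "image2", dst], check=True)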

Resize / line art

PaintsChainer requires 128x128 and 512x512 images. I wrote scripts to do the resizing.

128x128 https://github.com/ikeyasu/PaintsChainer/blob/ikeyasu_mod/cgi-bin/paint_x2_unet/tools/resize.py

512x512 https://github.com/ikeyasu/PaintsChainer/blob/ikeyasu_mod/cgi-bin/paint_x2_unet/tools/resizex2.py

Also, for line art, I referred to k3nt0's blog.

https://github.com/ikeyasu/PaintsChainer/blob/ikeyasu_mod/cgi-bin/paint_x2_unet/tools/image2line.py
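For reference, the sketch below approximates what these tools do: a plain resize plus a classic dilation-difference line extraction with OpenCV. This is my own rough reconstruction, not the actual code from the linked scripts, which may differ in details.

import cv2

def resize_to(src_path, dst_path, size=128):
    # Plain resize to size x size (use 512 for the x2 variant).
    img = cv2.imread(src_path)
    img = cv2.resize(img, (size, size), interpolation=cv2.INTER_AREA)
    cv2.imwrite(dst_path, img)

def image_to_line(src_path, dst_path):
    # Dilation-difference line extraction: dilate the grayscale image,
    # take the absolute difference with the original, and invert so that
    # edges come out as dark lines on a white background.
    gray = cv2.imread(src_path, cv2.IMREAD_GRAYSCALE)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    dilated = cv2.dilate(gray, kernel, iterations=1)
    line = 255 - cv2.absdiff(dilated, gray)
    cv2.imwrite(dst_path, line)

if __name__ == "__main__":
    resize_to("../images/original/movie_01_1.jpg", "../images/color/movie_01_1.jpg", 128)
    image_to_line("../images/original/movie_01_1.jpg", "../images/line/movie_01_1.jpg")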

Run the scripts above on the images you just extracted. This takes a lot of time, so parallelize it with GNU parallel.

I wrote the following script.

cgi-bin/paint_x2_unet/tools/run.sh:

ls -v1 ../images/original/ | parallel -j 8  'echo {}; python resize.py -i {} -o ../images/color/{}'
ls -v1 ../images/original/  | parallel -j 8  'echo {}; python image2line.py -i {} -o ../images/line/{}'
ls -v1 ../images/original/  | parallel -j 8  'echo {}; python resizex2.py -i {} -o ../images/colorx2/{}'
ls -v1 ../images/original/  | parallel -j 8  'echo {}; python image2line.py -i {} -o ../images/linex2/{}'
Run it from the tools directory:

$ cd cgi-bin/paint_x2_unet
$ cd tools
$ ./run.sh

Also, store a list of the dataset file names in dat/images_color_train.dat.

$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet/tools
$ cd ../images/original
$ ls -v1 > ../../dat/images_color_train.dat

Training

All that's left is to train. I tinkered with the original code a bit. (It is based on the PaintsChainer code as of when I started training, so it is a little old.)

https://github.com/ikeyasu/PaintsChainer/commit/8e30ee6933c747580efe25c9c4d5165f55823966

$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet/images/original
$ cd ../../
$ python train_128.py -g 0 --dataset images/ -e 20 -o result1
$ cp result1/model_final models/model_cnn_128
$ python train_x2.py -g 0 -o result2/ --dataset images/ --snapshot_interval 5000 -e 20

Running it

The models to load are specified in cgi-bin/paint_x2_unet/cgi_exe.py:

serializers.load_npz(
            "./cgi-bin/paint_x2_unet/models/unet_128_standard", self.cnn_128)

and

serializers.load_npz(
            "./cgi-bin/paint_x2_unet/models/unet_512_standard", self.cnn)

These are the relevant lines. Copy the trained models to match:

$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet
$ cp result1/model_final models/unet_128_standard
$ cp result2/model_final models/unet_512_standard

Then run server.py

$ pwd
~/PaintsChainer/cgi-bin/paint_x2_unet
$ cd ../../
$ python server.py

You can see PaintsChainer by opening http://localhost:8000 in your browser. If you want to access it from another PC, specify the IP address of the host it is running on, e.g. python server.py --host 192.168.1.3.

So, what about the results?

Apparently there is no copyright in works created by artificial intelligence, but since this is colorization, the line art itself is copyrighted, so I can't post the results as-is. I will only quote small parts of the screen.

The color of the hair and eyes is beautifully painted.

(Screenshots omitted) Quoted from The Melancholy of Haruhi Suzumiya I, episode 5

Uniforms: (screenshots omitted) Quoted from The Melancholy of Haruhi Suzumiya I, episode 5

Men, too: (screenshots omitted) Quoted from The Melancholy of Haruhi Suzumiya I, episode 5

Issues

I posted the results that worked, but when I feed it hand-drawn line art from fan illustrations, it doesn't work at all. This may be because my line-art extraction method was not good. After all, you have to give artificial intelligence good teaching material.

http://d.hatena.ne.jp/zuruo/20080528/1212143328 (screenshot omitted: the ribbon on the head)

Also, even when you give it color hints, which are a key feature of PaintsChainer, it does not color very well. Why is that? Maybe the training data just isn't good enough.

Caution!! It takes a lot of time!

I wrote this as if you could try it quickly, but running everything up to this point took 294 hours (12 days and 6 hours!). And the original article says this is only the first stage...

Assuming the PC draws 200 W, the electricity cost is

200 W * (51 + 243) h * 0.026 yen/Wh = 1,528.8 yen

Here, 0.026 appears because TEPCO charges roughly 26 yen per kWh, i.e. 0.026 yen per Wh.
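A quick sanity check of the arithmetic in Python (the 200 W draw and the 26 yen/kWh rate are the assumptions stated above):

# Sanity check of the electricity cost estimate above.
watts = 200                # assumed power draw of the PC [W]
hours = 51 + 243           # the 294 hours of running time reported above [h]
yen_per_kwh = 26           # TEPCO's rate, roughly 26 yen per kWh
energy_kwh = watts * hours / 1000      # 58.8 kWh
cost_yen = energy_kwh * yen_per_kwh    # 1,528.8 yen
print(energy_kwh, cost_yen)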

For reference, the PC is a self-built machine with a GTX 1080 that cost about 170,000 yen.

Reference: Assemble a cube-type PC with GTX 1080
