Try to draw a "weather map-like front" by machine learning based on weather data (4): I tried to colorize the black-and-white weather map

In the previous article, Try to draw a "weather map-like front" by machine learning based on weather data (3), I wrote about cutting frontal elements out of the "preliminary weather map" to create training images.

In this article, because the color version of the "preliminary weather map" used for cutting out frontal elements was available to me for only a short period (an entirely personal circumstance), I tried colorizing the black-and-white version that I had been accumulating for much longer.

Colorizing the black-and-white "preliminary weather map"

Why colorize?

Recently we hear about black-and-white photographs and films being colorized by machine learning. Wartime images that once felt unreal gain an overwhelming sense of reality, as if that era had moved closer to our own.

The colorization here has nothing to do with such stories; it is purely about the algorithmic problem of extracting front elements.

In the black-and-white weather map, the fronts, latitude/longitude lines, coastlines, and isobars are, naturally, all drawn in the same black. I could not think of a good way to cut out only the fronts from that, which is why colorization became necessary.

What I actually did, in chronological order:

(1) First, I built a neural network whose output training image was the preliminary weather map itself, without cutting out frontal elements. The fronts were not learned well.
(2) While thinking about how to extract frontal elements so that learning could be narrowed down to the fronts, I noticed that a color version of the "preliminary weather map" existed.
(3) I downloaded the color version that was available and found that I could extract front images based on their colors.
(4) Training on target images containing only fronts worked to some extent, so I decided to increase the number of weather map examples. At that point, however, only about half a year of the color version was available.
(5) Since the network I had been building generates an image from an image, I wondered whether it could also convert a black-and-white weather map into a color one.

That was the idea. My image-to-image network was already working to some extent, so I thought it could be applied here.

As a result, this worked better than I expected.

This increased the training images from only about half a year's worth to more than two years' worth in one go. With two full rounds of spring, summer, autumn and winter, the variety of training images grew considerably.

The neural network I created

Hardware and software environment

Mac mini (2018), 3.2 GHz 6-core Intel Core i7, 32 GB 2667 MHz DDR4 memory

The software environment is macOS Catalina, Python 3.7, and Keras 2.0.

CNN structure

The network has a fairly simple structure: a Sequential model in which features are extracted from a 1-channel grayscale image by Conv2D layers, passed through four convolutional layers at the bottleneck, and then returned to a 3-channel color image by Conv2DTranspose layers.

The input image is 256x256. Four Conv2D layers with strides=(2,2) reduce it to 16x16, four more Conv2D layers are applied at that size, and Conv2DTranspose layers then restore it to the original size.
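As a quick check of that size progression (my own illustrative snippet, not part of the original script): with padding='same', each stride-2 convolution halves the spatial size, rounding up, so four of them take 256 down to 16.

# Spatial size after each stride-2 Conv2D with padding='same' (illustration only)
size = 256
for step in range(4):
    size = -(-size // 2)   # ceil(size / 2)
    print(size)            # 128, 64, 32, 16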

Because training uses mean_squared_error, the network learns to reproduce the training image pixel by pixel.

network.py



import os
from keras.models import Sequential
from keras.layers import Conv2D, Conv2DTranspose

# parameter settings
in_ch   = 1     # input channels (black-and-white image)
i_dmlat = 256   # image height (latitude direction)
i_dmlon = 256   # image width (longitude direction)

num_hidden1 = 32
num_hidden2 = 64
num_hidden3 = 128
num_hidden4 = 64
num_hidden5 = 32
num_hidden6 = 32

######### start of network definition
        
NN_1 = Sequential()

#--- encode start

NN_1.add(Conv2D(num_hidden1, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', input_shape=(in_ch, i_dmlat, i_dmlon), padding='same'))
NN_1.add(Conv2D(num_hidden1, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden1, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))

NN_1.add(Conv2D(num_hidden2, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden2, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden2, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))

#--- encode out
NN_1.add(Conv2D(num_hidden3, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden3, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden3, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden3, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))

#--- decode start

NN_1.add(Conv2DTranspose(num_hidden4, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))
NN_1.add(Conv2DTranspose(num_hidden4, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))

NN_1.add(Conv2DTranspose(num_hidden5, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))
NN_1.add(Conv2DTranspose(num_hidden5, data_format='channels_first', kernel_size=(3,3), strides=(2,2), activation='relu', padding='same'))

NN_1.add(Conv2D(num_hidden5, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))
NN_1.add(Conv2D(num_hidden6, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))

#--- back to 3 channel

NN_1.add(Conv2D(3, data_format='channels_first', kernel_size=(3,3), activation='relu', padding='same'))

####### end of network definition

# compile network
NN_1.compile(optimizer='adam', loss='mean_squared_error' , metrics=['accuracy'])

# do training
# np_i_data_train_tr : black-and-white input images,  shape (N, 1, 256, 256)
# np_t_data_train_tr : color training images,         shape (N, 3, 256, 256)
NN_1.fit(np_i_data_train_tr, np_t_data_train_tr, epochs=num_itter, callbacks=cbks, batch_size=8, validation_split=0.2)

# Save model and weights
json_string = NN_1.to_json()
with open(os.path.join(paramfiledir, 'cnnSPAStoColSPAS2_011_model.json'), 'w') as f:
    f.write(json_string)

NN_1.save_weights(os.path.join(paramfiledir, 'cnnSPAStoColSPAS2_011_weight.hdf5'))
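After building the model, the layer shapes can be checked with summary(); this call is not in my original listing, but it is a quick way to confirm the 16x16 bottleneck and the final 3-channel 256x256 output.

NN_1.summary()              # layer-by-layer output shapes: 256x256 -> 16x16 -> 256x256
print(NN_1.output_shape)    # (None, 3, 256, 256) with channels_first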

Learning and conversion

Input and output

The input image looks like this (preliminary weather map, 2018/9/30 21 UTC): spas.u.2018093021.v8.512.png

The corresponding color training image looks like this: SPAS_COLOR_201809302100.v8.512.png
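For reference, here is a minimal sketch of how such an image pair could be loaded into the channels_first arrays expected by the network. The file names are the examples above; the loader itself is my own illustration, not the original preprocessing code.

import numpy as np
from PIL import Image

def load_pair(bw_path, color_path):
    # black-and-white input -> shape (1, 256, 256), scaled to 0-1
    bw  = np.asarray(Image.open(bw_path).convert('L').resize((256, 256)), dtype=np.float32) / 255.0
    # color training image -> shape (3, 256, 256), channels_first
    col = np.asarray(Image.open(color_path).convert('RGB').resize((256, 256)), dtype=np.float32) / 255.0
    return bw[np.newaxis, :, :], col.transpose(2, 0, 1)

x, t = load_pair('spas.u.2018093021.v8.512.png', 'SPAS_COLOR_201809302100.v8.512.png')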

Learning results

Of the roughly six months of data, I set aside about two months' worth for evaluation and trained on about four months' worth of black-and-white/color pairs (more than 700 images, since there are six maps per day).

On the training data, the output converges to an image like the one below fairly quickly. The coastlines, latitude/longitude lines, and dates are overfitted in no time.

out2w_SPAS_COLOR_2018093021.v8.512.png

Prediction result

Using the trained network, I colorized an actual black-and-white weather map. The right is the original black-and-white map and the left is the colorized one. For the purpose of cutting out frontal elements by color, I think this colorization is sufficient.

spas_cmp_2017070109.png
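For completeness, a minimal sketch of how the saved model could be reloaded and applied to black-and-white maps (my own illustration; x_bw is a placeholder for an array of input images in the channels_first layout described above, and paramfiledir is the directory used in the training script):

import os
from keras.models import model_from_json

# load the saved architecture and weights
with open(os.path.join(paramfiledir, 'cnnSPAStoColSPAS2_011_model.json')) as f:
    NN_1 = model_from_json(f.read())
NN_1.load_weights(os.path.join(paramfiledir, 'cnnSPAStoColSPAS2_011_weight.hdf5'))

# x_bw: black-and-white maps, shape (N, 1, 256, 256)
colorized = NN_1.predict(x_bw)   # shape (N, 3, 256, 256)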

A small detail: the box with the date and time in the upper left is also simply learned and predicted as part of the image. The numbers on the map (pressure, movement speed) seem to be reproduced fairly faithfully, but "July 2017" is stubbornly predicted as "January 2018". Since the training data starts in 2018 and contains no July, the network seems to be saying, "I cannot predict what was never in the input."

Likewise, the era name "Reiwa" in the date string is stubbornly converted to "Heisei". It is as if training drove the relevant weights to zero and left only the bias, so that whatever comes in, "Heisei" comes out.

EF883BE9-9D9F-431C-A83A-8550432756A3.jpeg

Playing around

I tried some mischief with hand-drawn input (laughs). This one is my own creation. 25DE47A0-A84F-4793-970F-A6AD4989BF5A.jpeg

I also tried drawing one for children. AD1C083D-A87A-4DED-9FDB-58FDF19D4A13.jpeg

It seems I still need more practice at hand-drawing front symbols.

Summary

This time I summarized the colorization of black-and-white weather maps. As a result, the number of front cut-out images increased significantly, and the accuracy of front-drawing training toward the final goal, a "weather map-like front", improved.

Next time, as the final episode, I will post Try to draw a "weather map-like front" by machine learning based on weather data (5): automatic front detection in weather data by machine learning (https://qiita.com/m-taque/items/2788f623365418db4078). As of 2020.2.9, the monthly limit on image uploads to Qiita has been reached, so the final episode is carried over to March.
