
No date for Valentine's Day? Let AI fill in!

by Marlene, 2022-02-14


Introduction

Valentine's Day is here again and I still don't have a date to spend it with, so I'm dragging an AI along instead.
At first I trained for 10,000 (ten thousand) epochs, and the result was so bad I almost thought the AI was telling me "你想peach" (dream on).
Only after I added another zero to the epoch count did it behave.
Enough chatter; on to the main content.

Main idea

Target function: y(t) = 13·cos(t) - 5·cos(2t) - 2·cos(3t) - cos(4t)

Unlike the previous post, this time the curve is given parametrically (the classic heart curve, with x(t) = 16·sin³(t)), but we only fit y.

Input features: [cos t, cos 2t, cos 3t, cos 4t]

Parameters to fit: [13, -5, -2, -1]

So no activation layer is needed; a single linear layer is enough.

Validation uses the leave-one-out approach.

Training runs for 100,000 epochs in total; all I can say is that the fit converges really slowly (see the least-squares sketch below for comparison).
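Since the model is linear in its features, the same four coefficients can also be recovered in closed form with ordinary least squares instead of 100,000 epochs of SGD. A minimal sketch, using torch.linalg.lstsq; this is not part of the original code:

import torch
t = torch.linspace(-15,15,1000)
# design matrix with columns [cos t, cos 2t, cos 3t, cos 4t]
A = torch.stack([torch.cos(t*i) for i in range(1,5)],1)
y = 13*torch.cos(t)-5*torch.cos(2*t)-2*torch.cos(3*t)-torch.cos(4*t)
coeffs = torch.linalg.lstsq(A, y.unsqueeze(1)).solution
print(coeffs.squeeze()) # should be close to [13, -5, -2, -1]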

Full code

# Heart curve, y component: y(t) = 13cos(t) - 5cos(2t) - 2cos(3t) - cos(4t)
import torch
import numpy as np
import random
import matplotlib.pyplot as plt
t = torch.linspace(-15,15,1000) # 1000 evenly spaced points from -15 to 15
x = 16*torch.sin(t)**3
y = 13*torch.cos(t)-5*torch.cos(2*t)-2*torch.cos(3*t)-torch.cos(4*t)
plt.scatter(x.data.numpy(),y.data.numpy())
plt.show()
 
def y_features(t):
    # feature columns: [cos t, cos 2t, cos 3t, cos 4t]
    t = t.unsqueeze(1)
    return torch.cat([torch.cos(t * i) for i in range(1,5)],1)
def x_features(t):
    t = t.unsqueeze(1)
    return 16*torch.sin(t)**3
t_weights = torch.Tensor([13,-5,-2,-1]).unsqueeze(1)
def target(t):
    return t.mm(t_weights) # matrix multiply: features @ true weights
# randomly generate training data
def get_batch_data(batch_size):
    batch_x = torch.randn(batch_size)   # random t values drawn from a standard normal
    features_x = x_features(batch_x)    # x coordinates: 16*sin(t)^3
    features_y = y_features(batch_x)    # feature columns [cos t, cos 2t, cos 3t, cos 4t]
    target_x = features_x
    target_y = target(features_y)       # true y values
    return target_x, features_y, target_y
# build the model
class PolynomiaRegression(torch.nn.Module):
    def __init__(self):
        super(PolynomiaRegression,self).__init__()
        self.poly = torch.nn.Linear(4,1)
    def forward(self,t):
        return self.poly(t)
# start training
import math
epochs = 100000
batch_size = 32
model = PolynomiaRegression()
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)
loss_value = np.inf
loss_holder = []
step = 0
for epoch in range(epochs):
    target_x,batch_x,batch_y = get_batch_data(batch_size)
    out = model(batch_x)
    loss = criterion(out,batch_y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if(loss<loss_value):
        torch.save(model,'model.ckpt') # save the best model so far
        loss_value = loss
    if(epoch%100==0):
        step+=1
        loss_holder.append([step,math.sqrt(loss/batch_size)])
        if(epoch%1000==0):
            print("Epoch:[{}/{}],loss:[{:.6f}]".format(epoch+1,epochs,loss.item()))
            if(epoch%10000==0):
                predict = model(y_features(t))
                plt.plot(x.data.numpy(),predict.squeeze(1).data.numpy(),"r")
                loss = criterion(predict,y.unsqueeze(1))
                plt.title("Loss:{:.4f}".format(loss.item()))
                plt.xlabel("X")
                plt.ylabel("Y")
                plt.scatter(x,y)
                plt.show()
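After training, one quick sanity check (not in the original post) is to reload the saved checkpoint and compare the learned coefficients against the true values [13, -5, -2, -1]:

best_model = torch.load('model.ckpt') # newer PyTorch versions may require weights_only=False here
print(best_model.poly.weight.data)    # should be close to [13, -5, -2, -1]
print(best_model.poly.bias.data)      # should be close to 0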

Results


From the run you can see that the loss is basically stable after 98,100 epochs (9810×10). But the test result shows that even at 100,000 epochs the model still hasn't overfit... looks like a few more rounds of training are needed.
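The training loop records loss_holder every 100 epochs but never plots it; if you want to see for yourself where the loss levels off, here is a minimal sketch (assuming it runs right after the training loop above):

steps, losses = zip(*loss_holder)
plt.plot(steps, losses)
plt.xlabel("step (1 step = 100 epochs)")
plt.ylabel("sqrt(loss / batch_size)")
plt.show()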
