AutomaticWeightedLoss

A PyTorch implementation of Liebel L., Körner M., "Auxiliary Tasks in Multi-task Learning," arXiv preprint arXiv:1805.06334, 2018.

The above paper modifies the loss formulation of "Multi-task learning using uncertainty to weigh losses for scene geometry and semantics" (Kendall et al.): the log(σ) regularization term is replaced by log(1 + σ²), which is always non-negative, so the total weighted loss cannot become negative during training.
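For reference, here is a minimal sketch of such a weighting module. It only illustrates the idea and is not necessarily identical to the code in this repository; the class body and the attribute name params are assumptions made for the example.

import torch
import torch.nn as nn

class AutomaticWeightedLoss(nn.Module):
    """Illustrative sketch: uncertainty-based loss weighting with the
    non-negative log(1 + sigma^2) regularizer of Liebel & Körner."""

    def __init__(self, num=2):
        super().__init__()
        # one learnable weighting parameter (sigma) per task loss
        self.params = nn.Parameter(torch.ones(num))

    def forward(self, *losses):
        total = 0
        for i, loss in enumerate(losses):
            # 1 / (2 * sigma_i^2) scales each task loss;
            # log(1 + sigma_i^2) regularizes sigma_i and is always >= 0,
            # so the weighted sum cannot become negative
            total = total + 0.5 / (self.params[i] ** 2) * loss \
                    + torch.log(1 + self.params[i] ** 2)
        return total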

Requirements

  • Python
  • PyTorch

How to Train with Your Model

  • Clone the repository
git clone git@github.com:Mikoto10032/AutomaticWeightedLoss.git
  • Create an AutomaticWeightedLoss module
from AutomaticWeightedLoss import AutomaticWeightedLoss

awl = AutomaticWeightedLoss(2)  # we have 2 losses
loss1 = 1   # toy loss values for illustration; in practice these are loss tensors
loss2 = 2
loss_sum = awl(loss1, loss2)
  • Create an optimizer to learn weight coefficients
from torch import optim

model = Model()  # your multi-task model
optimizer = optim.Adam([
                {'params': model.parameters()},
                {'params': awl.parameters(), 'weight_decay': 0}  # no weight decay on the loss weights
            ])
  • A complete example
from torch import optim
from AutomaticWeightedLoss import AutomaticWeightedLoss

model = Model()  # your multi-task model with two outputs

awl = AutomaticWeightedLoss(2)  # we have 2 losses
loss_1 = ...  # first task's loss function (criterion)
loss_2 = ...  # second task's loss function (criterion)

# optimize both the model parameters and the loss weighting parameters
optimizer = optim.Adam([
                {'params': model.parameters()},
                {'params': awl.parameters(), 'weight_decay': 0}
            ])

for i in range(epoch):  # epoch: total number of training epochs
    for data, label1, label2 in data_loader:
        # forward
        pred1, pred2 = model(data)
        # calculate losses
        loss1 = loss_1(pred1, label1)
        loss2 = loss_2(pred2, label2)
        # weigh losses
        loss_sum = awl(loss1, loss2)
        # backward
        optimizer.zero_grad()
        loss_sum.backward()
        optimizer.step()
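If you want to watch how the weighting evolves, you can print the learnable parameters at the end of each epoch. The attribute name awl.params below is an assumption taken from the sketch above; adapt it to the actual module.

# inside the epoch loop, after optimizer.step()
# (assumes the module stores its weights as `awl.params`, as in the sketch above)
print(f"epoch {i}: loss weighting parameters = {awl.params.tolist()}")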

Something to Say

In practice, this automatic weighting is not always effective, but I hope it helps you.
