2.5 Loss Function

Hao Wu, Huyan Chen, Xuchao Wang, Liheng Yu, Zekuan Yu, Zhijie Shi, Jinhua Xu, Biqin Dong, Shujin Zhu

In our work, the cross-entropy loss function (19) is employed for all deep models. The cross-entropy loss is computed as follows:

L_CE = -\frac{1}{N} \sum_{i=1}^{N} \sum_{c=1}^{C} y_{i,c} \log(p_{i,c})    (19)

where y_{i,c} is the one-hot encoding of the ground-truth label and p_{i,c} is the predicted probability that pixel i belongs to class c; the indices c and i iterate over all C classes and N pixels, respectively. Because cross-entropy minimizes pixel-wise error, in class-imbalanced cases the dominant classes can be over-represented in the loss, so minority-class samples are poorly represented and contribute only weakly to training.
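The pixel-wise cross-entropy above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation; the function name, the epsilon clamp, and the toy data are assumptions for the example.

```python
import numpy as np

def cross_entropy_loss(y_onehot, p, eps=1e-12):
    """Mean pixel-wise cross-entropy, as in Eq. (19).

    y_onehot: (N, C) one-hot ground-truth labels for N pixels.
    p:        (N, C) predicted class probabilities (softmax outputs).
    eps:      small clamp to avoid log(0); an illustrative choice.
    """
    p = np.clip(p, eps, 1.0)
    # Sum over classes c for each pixel i, then average over pixels.
    return -np.mean(np.sum(y_onehot * np.log(p), axis=1))

# Toy example: 3 pixels, 2 classes.
y = np.array([[1, 0], [0, 1], [1, 0]], dtype=float)
p = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
loss = cross_entropy_loss(y, p)
```

Note how an imbalanced batch skews this average: if most pixels belong to one class, their terms dominate the mean, which is exactly the weakness for minority classes described above.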
