Neural Structured Learning (Regression)

Santhosh
2 min read · Apr 8, 2020

--

Neural Structured Learning (NSL) is a framework for training neural networks with structured signals. It can be applied to NLP, vision, or any prediction problem in general (classification, regression).

The structure can be given explicitly (like knowledge graphs in NLP), or it can be generated on the fly during training (like creating adversarial examples by perturbing the input data).

Why is this helpful?

  • It helps achieve better accuracy when the number of labeled samples is small
  • It produces more robust models (adversarial examples are generated specifically to push the model towards wrong predictions)
  • The models become less sensitive to slight variations in the input data

The high-level idea of NSL

  • Create any neural network model
  • For each batch of data:
  • Calculate the regular loss (cross-entropy or mean squared error)
  • Perturb the batch in the direction of the gradient (the perturbed examples are called adversarial examples of the original data)
  • Calculate the adversarial loss by passing the perturbed examples through the same model
  • loss = regular loss + adversarial loss
  • Optimize the overall loss

Applying NSL to Regression Models

Create a simple regression model
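
A minimal sketch of such a model is below; the input dimension and layer sizes are placeholder choices, not tied to any particular dataset.

```python
import tensorflow as tf

NUM_FEATURES = 13  # placeholder feature count for a tabular dataset

def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(NUM_FEATURES,)),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(1),  # single continuous output for regression
    ])

model = build_model()
optimizer = tf.keras.optimizers.Adam()
```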

Regular loss
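
Assuming `x_batch` and `y_batch` hold one batch of features and targets, the regular loss is just the usual mean squared error. The inputs are watched by a gradient tape so their gradient can be reused for the perturbation in the next step:

```python
loss_fn = tf.keras.losses.MeanSquaredError()

# x_batch, y_batch: one batch of features/targets from the training data
x_batch = tf.convert_to_tensor(x_batch, dtype=tf.float32)

with tf.GradientTape(watch_accessed_variables=False) as input_tape:
    input_tape.watch(x_batch)  # track gradients w.r.t. the inputs, not the weights
    predictions = model(x_batch, training=True)
    regular_loss = loss_fn(y_batch, predictions)
```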

Generate adversarial examples by perturbing the inputs in the direction of the current gradient
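
Continuing from the snippet above, the gradient of the regular loss with respect to the inputs points in the direction that increases the loss the most; taking a small step along its sign gives the adversarial batch (one common choice of perturbation; NSL also supports L2-normalised gradients). The step size here is an illustrative value.

```python
adv_step_size = 0.05  # magnitude of the perturbation; worth tuning

# Gradient of the regular loss w.r.t. the inputs (possible because the tape
# watched x_batch above).
input_grad = input_tape.gradient(regular_loss, x_batch)

# Move each feature a small step along the sign of its gradient.
perturbation = adv_step_size * tf.sign(input_grad)
x_adv = x_batch + perturbation
```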

Calculate adversarial loss
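
The adversarial loss is the same mean squared error, just computed on the perturbed batch against the original targets:

```python
adv_predictions = model(x_adv, training=True)
adversarial_loss = loss_fn(y_batch, adv_predictions)
```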

Training
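
One way to put the pieces together is a custom train step (a sketch that assumes the `model`, `optimizer` and `loss_fn` defined above): the first tape gives the input gradient used for the perturbation, the second gives the weight gradients of the combined loss. The 0.2 weight on the adversarial term is the weighting discussed below.

```python
def train_step(x_batch, y_batch, adv_step_size=0.05, adv_weight=0.2):
    x_batch = tf.convert_to_tensor(x_batch, dtype=tf.float32)

    # Pass 1: gradient of the regular loss w.r.t. the inputs only.
    with tf.GradientTape(watch_accessed_variables=False) as input_tape:
        input_tape.watch(x_batch)
        regular_loss = loss_fn(y_batch, model(x_batch, training=True))
    input_grad = input_tape.gradient(regular_loss, x_batch)
    x_adv = x_batch + adv_step_size * tf.sign(input_grad)

    # Pass 2: weight gradients of the weighted sum of both losses.
    with tf.GradientTape() as model_tape:
        regular_loss = loss_fn(y_batch, model(x_batch, training=True))
        adversarial_loss = loss_fn(y_batch, model(x_adv, training=True))
        total_loss = regular_loss + adv_weight * adversarial_loss

    grads = model_tape.gradient(total_loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return total_loss
```

A training loop then just calls `train_step` on each batch of a tf.data.Dataset. The same behaviour can also be obtained by wrapping the Keras model with the library's nsl.keras.AdversarialRegularization wrapper instead of writing the loop by hand.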

While combining both losses, it's better to take a weighted sum (regular_loss + 0.2 * adversarial_loss).

The adversarial loss will not be very different from the regular loss because the perturbation is small (play around with adv_step_size in the config).

If you have categorical variables in your data and don't want to perturb them, use a feature mask (https://github.com/tensorflow/neural-structured-learning/issues/50).
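
In the manual setup above, this amounts to zeroing the perturbation for the columns that should stay untouched. A sketch, where the column indices are made-up examples (the NSL config exposes an equivalent feature_mask option, as discussed in the linked issue):

```python
import numpy as np

# 1.0 = feature may be perturbed, 0.0 = feature stays untouched.
# Columns 3 and 7 stand in for hypothetical one-hot/categorical columns.
mask = np.ones(NUM_FEATURES, dtype='float32')
mask[[3, 7]] = 0.0
feature_mask = tf.constant(mask)

# Apply the mask when building the adversarial batch.
perturbation = adv_step_size * tf.sign(input_grad) * feature_mask
x_adv = x_batch + perturbation
```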
