TensorFlow Lattice ensures your machine learning models follow global trends

Google’s TensorFlow team released TensorFlow Lattice today to help developers ensure that their machine learning models adhere to global trends even when training data is noisy. Lattice draws on the concept of lookup tables to simplify the process of defining high-level rules that constrain a model’s behavior.

A lookup table is a representation of data that maps inputs (keys) to outputs (values). It’s easiest to conceptualize with a single key linked to a single output, but an entry can have multiple keys in the case of more complex, multi-dimensional functions. Roughly speaking, the TensorFlow team’s approach is to train the lookup table’s values on training data to maximize accuracy subject to the developer’s constraints.
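To make that concrete, here is a minimal sketch in plain NumPy of fitting a one-dimensional lookup table to noisy data, assuming uniformly spaced keypoints. It illustrates the idea only; it is not the tensorflow_lattice API.

```python
import numpy as np

# Illustrative sketch (not the tensorflow_lattice API): a 1-D lookup
# table with fixed, uniformly spaced input keypoints and trainable
# output values, fit to noisy training data.
keypoints = np.linspace(0.0, 1.0, 5)  # the table's keys stay fixed

rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, 200)
y_train = 2.0 * x_train + rng.normal(0.0, 0.2, 200)  # noisy upward trend

def hat_weights(x):
    # Each input activates its two surrounding keypoints with
    # piecewise-linear "hat" weights, so a prediction is a linear
    # interpolation of the stored values.
    spacing = keypoints[1] - keypoints[0]
    return np.clip(1.0 - np.abs(x[:, None] - keypoints[None, :]) / spacing,
                   0.0, 1.0)

# Solve for the table values that best fit the data (least squares).
values, *_ = np.linalg.lstsq(hat_weights(x_train), y_train, rcond=None)

# Predicting for a new input just interpolates between table entries.
y_hat = np.interp(0.37, keypoints, values)
```

Only the stored output values are learned; the keys themselves never move, which is what makes constraining the table tractable.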

Operating this way offers a number of benefits. As previously mentioned, it makes monotonic relationships easy to define: developers can guarantee that as an input moves in one direction, the output moves in the same direction.

The team gives the example of cars and traffic: more cars on the road means more traffic. In a situation like this, monotonicity is represented as constraints on the lookup table parameters. These constraints encode prior knowledge, which improves outcomes, particularly when models are applied to similar but distinct problems.
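Continuing the sketch above, one crude way to express that kind of constraint is to project the learned table values so they never decrease as the key grows. (The library enforces monotonicity inside the optimization itself, but the effect on the table is the same.)

```python
# Project the learned values so they never decrease as the key grows.
values_mono = np.maximum.accumulate(values)

# A monotone table yields monotone predictions: a larger input can
# only map to the same or a larger output.
few_cars = np.interp(0.2, keypoints, values_mono)
many_cars = np.interp(0.8, keypoints, values_mono)
assert few_cars <= many_cars
```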

Additionally, computation is expensive, and it’s often more efficient to store a reference table and interpolate between its entries than to compute an output for every possible input. A lattice table also gives developers more transparency than alternative approaches traditionally offer, since the learned parameters can be inspected directly.
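Interpolation is what keeps the table small. In two dimensions, for instance, a 2×2 lattice stores just four corner values and estimates everything in between, as in this continuation of the sketch:

```python
# A 2x2 lattice stores only four corner values; any interior point is
# estimated by bilinear interpolation rather than computed or stored.
corners = np.array([[0.0, 1.0],   # values at (x=0, y=0) and (x=0, y=1)
                    [2.0, 3.5]])  # values at (x=1, y=0) and (x=1, y=1)

def bilinear(x, y):
    # Blend along x at each y edge, then blend the two edges along y.
    y0 = (1 - x) * corners[0, 0] + x * corners[1, 0]
    y1 = (1 - x) * corners[0, 1] + x * corners[1, 1]
    return (1 - y) * y0 + y * y1

print(bilinear(0.5, 0.5))  # 1.625, blended from the four stored corners
```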

TensorFlow is offering four estimators built on lattice tables, each suited to a different type of problem. You can find additional information on GitHub.