Factorization Machines

Factorization Machines model all interactions between variables using factorized parameters. The model equation for a factorization machine of degree $d=2$ is defined as: $$ \hat{y}(x) = \omega_0 + \sum_{i=1}^n\omega_ix_i + \sum_{i=1}^n\sum_{j=i+1}^n v_i^Tv_jx_ix_j. $$

The complexity of straight forward computation of the equation is in $\mathcal{O}(kn^2)$. However, we have: \begin{align} \sum_{i=1}^n\sum_{j=i+1}^n v_i^Tv_jx_ix_j &= \frac{1}{2}\sum_{i=1}^n\sum_{j=1}^n v_i^Tv_jx_ix_j - \frac{1}{2}\sum_{i=1}^n v_i^Tv_ix_ix_i\\ &= \frac{1}{2}(\sum_{i=1}^n\sum_{j=1}^n\sum_{f=1}^k v_{if}v_{jf}x_ix_j - \sum_{i=1}^n\sum_{f=1}^k v_{if}v_{if}x_ix_i)\\ &= \frac{1}{2}\sum_{f=1}^k((\sum_{i=1}^n v_{if}x_i)^2 - \sum_{i=1}^n v_{if}^2x_i^2) \end{align} This equation has only linear complexity in both $k$ and $n$, i.e., its computation is in $\mathcal{O}(kn)$.

d-way Factorization Machines

The 2-way FM described so far can easily be generalized to a d-way FM: $$ \hat{y}(x) = \omega_0 + \sum_{i=1}^n\omega_ix_i + \sum_{l=2}^d\sum_{i_1=1}^n\cdot\cdot\cdot\sum_{i_l=i_{l-1}+1}^n(\prod_{j=1}^l x_{i_j})(\sum_{f=1}^{k_l}\prod_{j=1}^l v_{i_jf}^{(l)}). $$

Implementation

In [1]:
from sklearn import metrics
import tensorflow as tf
import numpy as np

# evaluation
def get_auc(y, y_pre):
    fpr, tpr, thresholds = metrics.roc_curve(y.astype(int), y_pre, pos_label=1)
    return metrics.auc(fpr, tpr)

# hyper-parameters
vector_dim = 8
learning_rate = 1e-4
l2_factor = 1e-2
max_training_step = 400
train_rate = 0.8

# split data
data = np.loadtxt(fname='/kaggle/input/data', delimiter='\t')
thredhold = int(train_rate * len(data))
x_train = data[:thredhold, :-1]
y_train = data[:thredhold, -1]
x_test = data[thredhold:, :-1]
y_test = data[thredhold:, -1]
feature_num = len(data[0])-1

# construct graph
# model parameters
w_0 = tf.Variable(0.0)
w = tf.Variable(tf.zeros(shape=[feature_num]))
v = tf.Variable(tf.truncated_normal(shape=[feature_num, vector_dim], mean=0.0, stddev=0.01))

# construct loss
x = tf.placeholder(shape=[None, feature_num], dtype=tf.float32)
y = tf.placeholder(shape=[None], dtype=tf.float32)

linear_term = w_0 + tf.reduce_sum(tf.expand_dims(w, axis=0) * x, axis=1)
square_of_sum = tf.square(tf.reduce_sum(tf.expand_dims(x, axis=2) * tf.expand_dims(v, axis=0), axis=1))
sum_of_square = tf.reduce_sum(tf.square(tf.expand_dims(v, axis=0) * tf.expand_dims(x, axis=2)), axis=1)
y_pre = tf.sigmoid(linear_term + 0.5 * tf.reduce_sum(square_of_sum - sum_of_square, axis=1))

cross_entropy = - y * tf.log(y_pre) - (1 - y) * tf.log(1 - y_pre)
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(cross_entropy + l2_factor * tf.add_n([tf.nn.l2_loss(item) for item in [w_0, w, v]]))

accuracy = tf.reduce_mean(tf.cast(tf.less(tf.abs(y - y_pre), 0.5), dtype=tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(max_training_step):
        sess.run(train_op, {x: x_train, y: y_train})
        acc, y_pre_value = sess.run([accuracy, y_pre], {x: x_test, y: y_test})
        print('----step%3d accuracy: %.3f, auc: %.3f' % (step, acc, get_auc(y_test, y_pre_value)))
----step  0 accuracy: 0.700, auc: 0.645
----step  1 accuracy: 0.700, auc: 0.656
----step  2 accuracy: 0.700, auc: 0.663
----step  3 accuracy: 0.700, auc: 0.673
----step  4 accuracy: 0.700, auc: 0.681
----step  5 accuracy: 0.700, auc: 0.688
----step  6 accuracy: 0.700, auc: 0.696
----step  7 accuracy: 0.700, auc: 0.702
----step  8 accuracy: 0.700, auc: 0.708
----step  9 accuracy: 0.700, auc: 0.714
----step 10 accuracy: 0.700, auc: 0.722
----step 11 accuracy: 0.700, auc: 0.726
----step 12 accuracy: 0.700, auc: 0.734
----step 13 accuracy: 0.700, auc: 0.738
----step 14 accuracy: 0.700, auc: 0.741
----step 15 accuracy: 0.700, auc: 0.744
----step 16 accuracy: 0.700, auc: 0.747
----step 17 accuracy: 0.700, auc: 0.750
----step 18 accuracy: 0.700, auc: 0.753
----step 19 accuracy: 0.705, auc: 0.756
----step 20 accuracy: 0.705, auc: 0.757
----step 21 accuracy: 0.700, auc: 0.760
----step 22 accuracy: 0.705, auc: 0.762
----step 23 accuracy: 0.705, auc: 0.764
----step 24 accuracy: 0.705, auc: 0.766
----step 25 accuracy: 0.710, auc: 0.768
----step 26 accuracy: 0.710, auc: 0.771
----step 27 accuracy: 0.710, auc: 0.773
----step 28 accuracy: 0.710, auc: 0.775
----step 29 accuracy: 0.710, auc: 0.777
----step 30 accuracy: 0.710, auc: 0.778
----step 31 accuracy: 0.710, auc: 0.780
----step 32 accuracy: 0.710, auc: 0.780
----step 33 accuracy: 0.710, auc: 0.781
----step 34 accuracy: 0.710, auc: 0.782
----step 35 accuracy: 0.710, auc: 0.783
----step 36 accuracy: 0.715, auc: 0.784
----step 37 accuracy: 0.725, auc: 0.785
----step 38 accuracy: 0.725, auc: 0.787
----step 39 accuracy: 0.725, auc: 0.787
----step 40 accuracy: 0.725, auc: 0.789
----step 41 accuracy: 0.725, auc: 0.789
----step 42 accuracy: 0.725, auc: 0.790
----step 43 accuracy: 0.720, auc: 0.790
----step 44 accuracy: 0.725, auc: 0.791
----step 45 accuracy: 0.730, auc: 0.792
----step 46 accuracy: 0.730, auc: 0.793
----step 47 accuracy: 0.735, auc: 0.793
----step 48 accuracy: 0.735, auc: 0.793
----step 49 accuracy: 0.740, auc: 0.794
----step 50 accuracy: 0.740, auc: 0.794
----step 51 accuracy: 0.740, auc: 0.795
----step 52 accuracy: 0.740, auc: 0.795
----step 53 accuracy: 0.755, auc: 0.796
----step 54 accuracy: 0.760, auc: 0.797
----step 55 accuracy: 0.755, auc: 0.798
----step 56 accuracy: 0.755, auc: 0.799
----step 57 accuracy: 0.750, auc: 0.799
----step 58 accuracy: 0.750, auc: 0.799
----step 59 accuracy: 0.750, auc: 0.799
----step 60 accuracy: 0.750, auc: 0.800
----step 61 accuracy: 0.750, auc: 0.800
----step 62 accuracy: 0.755, auc: 0.801
----step 63 accuracy: 0.750, auc: 0.802
----step 64 accuracy: 0.750, auc: 0.802
----step 65 accuracy: 0.750, auc: 0.802
----step 66 accuracy: 0.750, auc: 0.802
----step 67 accuracy: 0.750, auc: 0.802
----step 68 accuracy: 0.750, auc: 0.803
----step 69 accuracy: 0.750, auc: 0.803
----step 70 accuracy: 0.750, auc: 0.803
----step 71 accuracy: 0.745, auc: 0.804
----step 72 accuracy: 0.745, auc: 0.804
----step 73 accuracy: 0.750, auc: 0.804
----step 74 accuracy: 0.750, auc: 0.804
----step 75 accuracy: 0.755, auc: 0.804
----step 76 accuracy: 0.755, auc: 0.805
----step 77 accuracy: 0.750, auc: 0.805
----step 78 accuracy: 0.750, auc: 0.805
----step 79 accuracy: 0.760, auc: 0.806
----step 80 accuracy: 0.760, auc: 0.806
----step 81 accuracy: 0.760, auc: 0.807
----step 82 accuracy: 0.765, auc: 0.807
----step 83 accuracy: 0.765, auc: 0.807
----step 84 accuracy: 0.765, auc: 0.807
----step 85 accuracy: 0.760, auc: 0.808
----step 86 accuracy: 0.760, auc: 0.808
----step 87 accuracy: 0.760, auc: 0.808
----step 88 accuracy: 0.760, auc: 0.809
----step 89 accuracy: 0.760, auc: 0.809
----step 90 accuracy: 0.755, auc: 0.809
----step 91 accuracy: 0.755, auc: 0.809
----step 92 accuracy: 0.755, auc: 0.809
----step 93 accuracy: 0.755, auc: 0.810
----step 94 accuracy: 0.755, auc: 0.809
----step 95 accuracy: 0.755, auc: 0.810
----step 96 accuracy: 0.760, auc: 0.811
----step 97 accuracy: 0.760, auc: 0.811
----step 98 accuracy: 0.760, auc: 0.811
----step 99 accuracy: 0.760, auc: 0.812
----step100 accuracy: 0.760, auc: 0.812
----step101 accuracy: 0.760, auc: 0.812
----step102 accuracy: 0.760, auc: 0.812
----step103 accuracy: 0.760, auc: 0.812
----step104 accuracy: 0.760, auc: 0.813
----step105 accuracy: 0.760, auc: 0.813
----step106 accuracy: 0.760, auc: 0.813
----step107 accuracy: 0.760, auc: 0.813
----step108 accuracy: 0.760, auc: 0.813
----step109 accuracy: 0.760, auc: 0.813
----step110 accuracy: 0.760, auc: 0.813
----step111 accuracy: 0.765, auc: 0.814
----step112 accuracy: 0.765, auc: 0.814
----step113 accuracy: 0.765, auc: 0.814
----step114 accuracy: 0.765, auc: 0.814
----step115 accuracy: 0.765, auc: 0.814
----step116 accuracy: 0.765, auc: 0.814
----step117 accuracy: 0.765, auc: 0.814
----step118 accuracy: 0.765, auc: 0.814
----step119 accuracy: 0.765, auc: 0.814
----step120 accuracy: 0.765, auc: 0.814
----step121 accuracy: 0.770, auc: 0.814
----step122 accuracy: 0.770, auc: 0.815
----step123 accuracy: 0.770, auc: 0.814
----step124 accuracy: 0.770, auc: 0.814
----step125 accuracy: 0.770, auc: 0.814
----step126 accuracy: 0.770, auc: 0.815
----step127 accuracy: 0.770, auc: 0.815
----step128 accuracy: 0.770, auc: 0.815
----step129 accuracy: 0.785, auc: 0.815
----step130 accuracy: 0.785, auc: 0.815
----step131 accuracy: 0.785, auc: 0.815
----step132 accuracy: 0.785, auc: 0.816
----step133 accuracy: 0.785, auc: 0.815
----step134 accuracy: 0.785, auc: 0.816
----step135 accuracy: 0.785, auc: 0.816
----step136 accuracy: 0.785, auc: 0.816
----step137 accuracy: 0.785, auc: 0.816
----step138 accuracy: 0.785, auc: 0.816
----step139 accuracy: 0.785, auc: 0.816
----step140 accuracy: 0.785, auc: 0.816
----step141 accuracy: 0.785, auc: 0.816
----step142 accuracy: 0.785, auc: 0.816
----step143 accuracy: 0.785, auc: 0.816
----step144 accuracy: 0.785, auc: 0.816
----step145 accuracy: 0.785, auc: 0.816
----step146 accuracy: 0.785, auc: 0.817
----step147 accuracy: 0.785, auc: 0.817
----step148 accuracy: 0.785, auc: 0.817
----step149 accuracy: 0.785, auc: 0.816
----step150 accuracy: 0.785, auc: 0.816
----step151 accuracy: 0.785, auc: 0.816
----step152 accuracy: 0.785, auc: 0.816
----step153 accuracy: 0.785, auc: 0.817
----step154 accuracy: 0.785, auc: 0.816
----step155 accuracy: 0.785, auc: 0.816
----step156 accuracy: 0.785, auc: 0.816
----step157 accuracy: 0.785, auc: 0.816
----step158 accuracy: 0.785, auc: 0.817
----step159 accuracy: 0.785, auc: 0.817
----step160 accuracy: 0.785, auc: 0.817
----step161 accuracy: 0.785, auc: 0.817
----step162 accuracy: 0.785, auc: 0.817
----step163 accuracy: 0.785, auc: 0.817
----step164 accuracy: 0.785, auc: 0.818
----step165 accuracy: 0.785, auc: 0.818
----step166 accuracy: 0.785, auc: 0.818
----step167 accuracy: 0.785, auc: 0.818
----step168 accuracy: 0.785, auc: 0.818
----step169 accuracy: 0.785, auc: 0.819
----step170 accuracy: 0.785, auc: 0.819
----step171 accuracy: 0.785, auc: 0.819
----step172 accuracy: 0.780, auc: 0.819
----step173 accuracy: 0.780, auc: 0.820
----step174 accuracy: 0.780, auc: 0.820
----step175 accuracy: 0.780, auc: 0.820
----step176 accuracy: 0.785, auc: 0.820
----step177 accuracy: 0.785, auc: 0.820
----step178 accuracy: 0.785, auc: 0.820
----step179 accuracy: 0.785, auc: 0.820
----step180 accuracy: 0.785, auc: 0.820
----step181 accuracy: 0.780, auc: 0.820
----step182 accuracy: 0.780, auc: 0.820
----step183 accuracy: 0.780, auc: 0.820
----step184 accuracy: 0.780, auc: 0.820
----step185 accuracy: 0.780, auc: 0.820
----step186 accuracy: 0.780, auc: 0.820
----step187 accuracy: 0.780, auc: 0.820
----step188 accuracy: 0.780, auc: 0.820
----step189 accuracy: 0.780, auc: 0.820
----step190 accuracy: 0.780, auc: 0.820
----step191 accuracy: 0.780, auc: 0.820
----step192 accuracy: 0.780, auc: 0.820
----step193 accuracy: 0.780, auc: 0.820
----step194 accuracy: 0.775, auc: 0.820
----step195 accuracy: 0.775, auc: 0.821
----step196 accuracy: 0.775, auc: 0.821
----step197 accuracy: 0.775, auc: 0.821
----step198 accuracy: 0.775, auc: 0.821
----step199 accuracy: 0.775, auc: 0.821
----step200 accuracy: 0.780, auc: 0.821
----step201 accuracy: 0.780, auc: 0.821
----step202 accuracy: 0.780, auc: 0.821
----step203 accuracy: 0.780, auc: 0.821
----step204 accuracy: 0.780, auc: 0.821
----step205 accuracy: 0.780, auc: 0.821
----step206 accuracy: 0.780, auc: 0.821
----step207 accuracy: 0.780, auc: 0.822
----step208 accuracy: 0.780, auc: 0.822
----step209 accuracy: 0.780, auc: 0.821
----step210 accuracy: 0.780, auc: 0.821
----step211 accuracy: 0.780, auc: 0.821
----step212 accuracy: 0.780, auc: 0.821
----step213 accuracy: 0.780, auc: 0.822
----step214 accuracy: 0.780, auc: 0.822
----step215 accuracy: 0.780, auc: 0.822
----step216 accuracy: 0.780, auc: 0.822
----step217 accuracy: 0.780, auc: 0.822
----step218 accuracy: 0.780, auc: 0.822
----step219 accuracy: 0.780, auc: 0.822
----step220 accuracy: 0.780, auc: 0.822
----step221 accuracy: 0.780, auc: 0.823
----step222 accuracy: 0.780, auc: 0.823
----step223 accuracy: 0.780, auc: 0.823
----step224 accuracy: 0.780, auc: 0.823
----step225 accuracy: 0.780, auc: 0.823
----step226 accuracy: 0.780, auc: 0.823
----step227 accuracy: 0.780, auc: 0.823
----step228 accuracy: 0.780, auc: 0.823
----step229 accuracy: 0.780, auc: 0.823
----step230 accuracy: 0.780, auc: 0.823
----step231 accuracy: 0.780, auc: 0.823
----step232 accuracy: 0.780, auc: 0.823
----step233 accuracy: 0.780, auc: 0.824
----step234 accuracy: 0.780, auc: 0.824
----step235 accuracy: 0.780, auc: 0.824
----step236 accuracy: 0.780, auc: 0.824
----step237 accuracy: 0.780, auc: 0.824
----step238 accuracy: 0.780, auc: 0.824
----step239 accuracy: 0.780, auc: 0.824
----step240 accuracy: 0.780, auc: 0.823
----step241 accuracy: 0.780, auc: 0.824
----step242 accuracy: 0.780, auc: 0.824
----step243 accuracy: 0.780, auc: 0.824
----step244 accuracy: 0.780, auc: 0.824
----step245 accuracy: 0.780, auc: 0.824
----step246 accuracy: 0.780, auc: 0.824
----step247 accuracy: 0.780, auc: 0.824
----step248 accuracy: 0.780, auc: 0.825
----step249 accuracy: 0.780, auc: 0.825
----step250 accuracy: 0.780, auc: 0.825
----step251 accuracy: 0.780, auc: 0.825
----step252 accuracy: 0.780, auc: 0.825
----step253 accuracy: 0.780, auc: 0.825
----step254 accuracy: 0.780, auc: 0.825
----step255 accuracy: 0.780, auc: 0.825
----step256 accuracy: 0.780, auc: 0.825
----step257 accuracy: 0.775, auc: 0.825
----step258 accuracy: 0.775, auc: 0.825
----step259 accuracy: 0.775, auc: 0.825
----step260 accuracy: 0.775, auc: 0.825
----step261 accuracy: 0.775, auc: 0.825
----step262 accuracy: 0.775, auc: 0.825
----step263 accuracy: 0.775, auc: 0.825
----step264 accuracy: 0.775, auc: 0.825
----step265 accuracy: 0.775, auc: 0.825
----step266 accuracy: 0.775, auc: 0.825
----step267 accuracy: 0.775, auc: 0.825
----step268 accuracy: 0.775, auc: 0.825
----step269 accuracy: 0.775, auc: 0.825
----step270 accuracy: 0.775, auc: 0.825
----step271 accuracy: 0.775, auc: 0.825
----step272 accuracy: 0.775, auc: 0.825
----step273 accuracy: 0.775, auc: 0.825
----step274 accuracy: 0.775, auc: 0.825
----step275 accuracy: 0.775, auc: 0.825
----step276 accuracy: 0.775, auc: 0.825
----step277 accuracy: 0.775, auc: 0.825
----step278 accuracy: 0.775, auc: 0.825
----step279 accuracy: 0.775, auc: 0.826
----step280 accuracy: 0.775, auc: 0.826
----step281 accuracy: 0.775, auc: 0.826
----step282 accuracy: 0.775, auc: 0.826
----step283 accuracy: 0.775, auc: 0.826
----step284 accuracy: 0.775, auc: 0.826
----step285 accuracy: 0.775, auc: 0.826
----step286 accuracy: 0.775, auc: 0.827
----step287 accuracy: 0.775, auc: 0.827
----step288 accuracy: 0.775, auc: 0.827
----step289 accuracy: 0.775, auc: 0.827
----step290 accuracy: 0.775, auc: 0.827
----step291 accuracy: 0.775, auc: 0.826
----step292 accuracy: 0.775, auc: 0.827
----step293 accuracy: 0.775, auc: 0.827
----step294 accuracy: 0.775, auc: 0.827
----step295 accuracy: 0.775, auc: 0.827
----step296 accuracy: 0.775, auc: 0.826
----step297 accuracy: 0.775, auc: 0.826
----step298 accuracy: 0.775, auc: 0.827
----step299 accuracy: 0.775, auc: 0.827
----step300 accuracy: 0.775, auc: 0.827
----step301 accuracy: 0.775, auc: 0.827
----step302 accuracy: 0.775, auc: 0.827
----step303 accuracy: 0.775, auc: 0.827
----step304 accuracy: 0.775, auc: 0.827
----step305 accuracy: 0.775, auc: 0.827
----step306 accuracy: 0.775, auc: 0.827
----step307 accuracy: 0.775, auc: 0.827
----step308 accuracy: 0.775, auc: 0.827
----step309 accuracy: 0.770, auc: 0.827
----step310 accuracy: 0.770, auc: 0.827
----step311 accuracy: 0.770, auc: 0.827
----step312 accuracy: 0.770, auc: 0.827
----step313 accuracy: 0.770, auc: 0.827
----step314 accuracy: 0.770, auc: 0.827
----step315 accuracy: 0.775, auc: 0.827
----step316 accuracy: 0.775, auc: 0.827
----step317 accuracy: 0.775, auc: 0.827
----step318 accuracy: 0.775, auc: 0.827
----step319 accuracy: 0.775, auc: 0.827
----step320 accuracy: 0.775, auc: 0.827
----step321 accuracy: 0.775, auc: 0.827
----step322 accuracy: 0.780, auc: 0.827
----step323 accuracy: 0.775, auc: 0.827
----step324 accuracy: 0.775, auc: 0.827
----step325 accuracy: 0.775, auc: 0.827
----step326 accuracy: 0.775, auc: 0.827
----step327 accuracy: 0.775, auc: 0.827
----step328 accuracy: 0.775, auc: 0.827
----step329 accuracy: 0.775, auc: 0.827
----step330 accuracy: 0.780, auc: 0.827
----step331 accuracy: 0.780, auc: 0.827
----step332 accuracy: 0.780, auc: 0.827
----step333 accuracy: 0.785, auc: 0.828
----step334 accuracy: 0.785, auc: 0.828
----step335 accuracy: 0.785, auc: 0.828
----step336 accuracy: 0.785, auc: 0.828
----step337 accuracy: 0.785, auc: 0.828
----step338 accuracy: 0.785, auc: 0.828
----step339 accuracy: 0.785, auc: 0.828
----step340 accuracy: 0.785, auc: 0.828
----step341 accuracy: 0.785, auc: 0.828
----step342 accuracy: 0.785, auc: 0.828
----step343 accuracy: 0.785, auc: 0.828
----step344 accuracy: 0.785, auc: 0.828
----step345 accuracy: 0.785, auc: 0.828
----step346 accuracy: 0.790, auc: 0.828
----step347 accuracy: 0.790, auc: 0.828
----step348 accuracy: 0.790, auc: 0.828
----step349 accuracy: 0.790, auc: 0.828
----step350 accuracy: 0.790, auc: 0.828
----step351 accuracy: 0.790, auc: 0.828
----step352 accuracy: 0.790, auc: 0.828
----step353 accuracy: 0.790, auc: 0.828
----step354 accuracy: 0.790, auc: 0.828
----step355 accuracy: 0.790, auc: 0.827
----step356 accuracy: 0.790, auc: 0.827
----step357 accuracy: 0.790, auc: 0.827
----step358 accuracy: 0.790, auc: 0.827
----step359 accuracy: 0.790, auc: 0.828
----step360 accuracy: 0.790, auc: 0.828
----step361 accuracy: 0.790, auc: 0.828
----step362 accuracy: 0.790, auc: 0.828
----step363 accuracy: 0.790, auc: 0.827
----step364 accuracy: 0.790, auc: 0.827
----step365 accuracy: 0.790, auc: 0.827
----step366 accuracy: 0.790, auc: 0.828
----step367 accuracy: 0.790, auc: 0.828
----step368 accuracy: 0.790, auc: 0.828
----step369 accuracy: 0.790, auc: 0.828
----step370 accuracy: 0.790, auc: 0.828
----step371 accuracy: 0.790, auc: 0.828
----step372 accuracy: 0.790, auc: 0.828
----step373 accuracy: 0.790, auc: 0.828
----step374 accuracy: 0.790, auc: 0.828
----step375 accuracy: 0.790, auc: 0.828
----step376 accuracy: 0.790, auc: 0.828
----step377 accuracy: 0.790, auc: 0.828
----step378 accuracy: 0.790, auc: 0.828
----step379 accuracy: 0.790, auc: 0.828
----step380 accuracy: 0.790, auc: 0.828
----step381 accuracy: 0.790, auc: 0.828
----step382 accuracy: 0.790, auc: 0.828
----step383 accuracy: 0.790, auc: 0.828
----step384 accuracy: 0.790, auc: 0.828
----step385 accuracy: 0.790, auc: 0.828
----step386 accuracy: 0.790, auc: 0.828
----step387 accuracy: 0.790, auc: 0.828
----step388 accuracy: 0.790, auc: 0.828
----step389 accuracy: 0.790, auc: 0.828
----step390 accuracy: 0.790, auc: 0.828
----step391 accuracy: 0.790, auc: 0.828
----step392 accuracy: 0.790, auc: 0.828
----step393 accuracy: 0.790, auc: 0.828
----step394 accuracy: 0.790, auc: 0.828
----step395 accuracy: 0.790, auc: 0.828
----step396 accuracy: 0.790, auc: 0.828
----step397 accuracy: 0.790, auc: 0.827
----step398 accuracy: 0.790, auc: 0.828
----step399 accuracy: 0.790, auc: 0.828

Some References

  1. Rendle, Steffen. "Factorization machines." Data Mining (ICDM), 2010 IEEE 10th International Conference on. IEEE, 2010.
  2. Juan, Yuchin, et al. "Field-aware factorization machines for CTR prediction." Proceedings of the 10th ACM Conference on Recommender Systems. ACM, 2016.
  3. Guo, Huifeng, et al. "Deepfm: a factorization-machine based neural network for ctr prediction." arXiv preprint arXiv:1703.04247 (2017).
  4. Qu, Yanru, et al. "Product-based neural networks for user response prediction." Data Mining (ICDM), 2016 IEEE 16th International Conference on. IEEE, 2016.