

Rank-consistent ordinal regression for neural networks

Wenzhi Cao a, Vahid Mirjalili b, Sebastian Raschka a,∗∗

a University of Wisconsin-Madison, Department of Statistics, 1300 University Ave, Madison, WI 53705, USA
b Michigan State University, Department of Computer Science & Engineering, 428 South Shaw Lane, East Lansing, MI 48824, USA

ABSTRACT

In many real-world prediction tasks, class labels include information about the relative ordering between labels, which is not captured by commonly-used loss functions such as multi-category cross-entropy. Recently, ordinal regression frameworks have been adopted by the deep learning community to take such ordering information into account. Using a framework that transforms ordinal targets into binary classification subtasks, neural networks were equipped with ordinal regression capabilities. However, this method suffers from inconsistencies among the different binary classifiers. We hypothesize that addressing the inconsistency issue in these binary classification task-based neural networks improves predictive performance. To test this hypothesis, we propose the COnsistent RAnk Logits (CORAL) framework with strong theoretical guarantees for rank-monotonicity and consistent confidence scores. Moreover, the proposed method is architecture-agnostic and can extend arbitrary state-of-the-art deep neural network classifiers for ordinal regression tasks. The empirical evaluation of the proposed rank-consistent method on a range of face-image datasets for age prediction shows a substantial reduction of the prediction error compared to the reference ordinal regression network.

1. Introduction

Ordinal regression (also called ordinal classification) describes the task of predicting labels on an ordinal scale. Here, a ranking rule or classifier h maps each object x_i ∈ X into an ordered set h : X → Y, where Y = {r_1 ≺ ... ≺ r_K}. In contrast to classification, the labels provide enough information to order objects. However, as opposed to metric regression, the difference between label values is arbitrary.

While the field of machine learning has developed many powerful algorithms for predictive modeling, most algorithms have been designed for classification tasks. The extended binary classification approach proposed by Li and Lin (Li and Lin, 2007) forms the basis of many ordinal regression implementations. However, neural network-based implementations of this approach commonly suffer from classifier inconsistencies among the binary rankings (Niu et al., 2016). We address the inconsistency issue in this paper with a new method and theorem for guaranteed classifier consistency that can easily be implemented in various neural network architectures.

∗∗Corresponding author: e-mail: [email protected] (Sebastian Raschka)

Furthermore, this study presents an empirical analysis of our approach on challenging real-world datasets for predicting the age of individuals from face images using our method with convolutional neural networks (CNNs). Aging can be regarded as a non-stationary process since age progression in early childhood is primarily associated with changes in the shape of the face. In contrast, aging during adulthood is defined mainly by changes in skin texture (Ramanathan et al., 2009). Based on this assumption, age prediction can be modeled using ordinal regression-based approaches (Yang et al., 2010; Chang et al., 2011; Cao et al., 2012; Li et al., 2012).

The main contributions of this paper are as follows:

1. The consistent rank logits (CORAL) framework for ordinal regression with theoretical guarantees for classifier consistency;

2. Implementation of CORAL to adapt common CNN architectures, such as ResNet (He et al., 2016), for ordinal regression;

3. Experiments on different age estimation datasets showing that CORAL's guaranteed binary classifier consistency improves predictive performance compared to the reference framework for ordinal regression.

arXiv:1901.07884v5 [cs.LG] 23 Jun 2020


2. Related Work

2.1. Ordinal Regression and Ranking

Several multivariate extensions of generalized linear models have been developed for ordinal regression in the past, including the popular proportional odds and proportional hazards models (McCullagh, 1980). Moreover, the machine learning field developed ordinal regression models based on extensions of well-studied classification algorithms, by reformulating the problem to utilize multiple binary classification tasks (Baccianella et al., 2009). Early work in this regard includes the use of perceptrons (Crammer and Singer, 2002; Shen and Joshi, 2005) and support vector machines (Herbrich et al., 1999; Shashua and Levin, 2003; Rajaram et al., 2003; Chu and Keerthi, 2005). Li and Lin (2007) proposed a general reduction framework that unified the view of a number of these existing algorithms.

2.2. Ordinal Regression CNN

While earlier works on using CNNs for ordinal targets have employed conventional classification approaches (Levi and Hassner, 2015; Rothe et al., 2015), the general reduction framework from ordinal regression to binary classification by Li and Lin (2007) was recently adopted by Niu et al. (2016) as Ordinal Regression CNN (OR-CNN). In the OR-CNN approach, an ordinal regression problem with K ranks is transformed into K − 1 binary classification problems, with the k-th task predicting whether the age label of a face image exceeds rank r_k, k = 1, ..., K − 1. All K − 1 tasks share the same intermediate layers but are assigned distinct weight parameters in the output layer.

While the OR-CNN was able to achieve state-of-the-art performance on benchmark datasets, it does not guarantee consistent predictions, such that predictions for individual binary tasks may disagree. For example, in an age estimation setting, it would be contradictory if the k-th binary task predicted that the age of a person was more than 30, but a previous task predicted the person's age was less than 20. This inconsistency could be suboptimal when the K − 1 task predictions are combined to obtain the estimated age.
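As a small illustration of this point (the values below are hypothetical, not taken from the paper), consider the outputs of K − 1 = 5 binary tasks for one image: a rank-monotonic output vector and an inconsistent one can be summed into a rank in the same way, but only the former yields non-contradictory binary decisions.

```python
# Hypothetical outputs of K - 1 = 5 binary tasks for one face image,
# where task k answers "is the age greater than rank r_k?".
consistent = [1, 1, 1, 0, 0]    # rank-monotonic: tasks 1-3 fire, tasks 4-5 do not
inconsistent = [1, 0, 1, 0, 1]  # tasks 3 and 5 contradict tasks 2 and 4

# Summing the binary outputs (the combination rule used later in Eq. 1) still
# yields a rank index for the inconsistent vector, but its individual binary
# decisions contradict each other.
print(1 + sum(consistent), 1 + sum(inconsistent))  # -> 4 4
```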

Niu et al. (2016) acknowledged the classifier inconsistency as not being ideal and also noted that ensuring the K − 1 binary classifiers are consistent would increase the training complexity substantially (Niu et al., 2016). The CORAL method proposed in this paper addresses both these issues with a theoretical guarantee for classifier consistency and without increasing the training complexity.

2.3. Other CNN Architectures for Age Estimation

Chen et al. (2017) proposed a modification of the OR-CNN (Niu et al., 2016), known as Ranking-CNN, that uses an ensemble of CNNs for binary classifications and aggregates the predictions to estimate the age label of a given face image. The researchers showed that training an ensemble of CNNs improves the predictive performance over a single CNN with multiple binary outputs (Chen et al., 2017), which is consistent with the well-known fact that an ensemble model can achieve better generalization performance than each individual classifier in the ensemble (Raschka and Mirjalili, 2019).

Recent research has also shown that training a multi-task CNN that shares lower-layer parameters for various face analysis tasks (face detection, gender prediction, age estimation, etc.) can improve the overall performance across different tasks compared to a single-task CNN (Ranjan et al., 2017).

Another approach for utilizing binary classifiers for ordinal regression is the siamese CNN architecture proposed by Polania et al. (2019), which computes the rank from pair-wise comparisons between the input image and multiple, carefully selected anchor images.

3. Proposed Method

This section describes our proposed CORAL framework that addresses the problem of classifier inconsistency in the ordinal regression CNN (OR-CNN) by Niu et al. (2016), which is based on multiple binary classification tasks for ranking.

3.1. Preliminaries

Let D = {x_i, y_i}_{i=1}^{N} be the training dataset consisting of N training examples. Here, x_i ∈ X denotes the i-th training example and y_i the corresponding rank, where y_i ∈ Y = {r_1, r_2, ..., r_K} with ordered ranks r_K ≻ r_{K−1} ≻ ... ≻ r_1. The ordinal regression task is to find a ranking rule h : X → Y such that a loss function L(h) is minimized.

Let C be a K×K cost matrix, where C_{y,r_k} is the cost of predicting an example (x, y) as rank r_k (Li and Lin, 2007). Typically, C_{y,y} = 0 and C_{y,r_k} > 0 for y ≠ r_k. In ordinal regression, we generally prefer each row of the cost matrix to be V-shaped, that is, C_{y,r_{k−1}} ≥ C_{y,r_k} if r_k ≤ y and C_{y,r_k} ≤ C_{y,r_{k+1}} if r_k ≥ y. The classification cost matrix has entries C_{y,r_k} = 1{y ≠ r_k} that do not consider ordering information. In ordinal regression, where the ranks are treated as numerical values, the absolute cost matrix is commonly defined by C_{y,r_k} = |y − r_k|.
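To make these cost-matrix definitions concrete, here is a small NumPy sketch (our own illustration, with K = 5 chosen arbitrarily) that builds the classification and absolute cost matrices; each row of the absolute cost matrix is V-shaped with its minimum at the true rank.

```python
import numpy as np

K = 5  # number of ranks; chosen arbitrarily for this illustration

# Classification cost: 1 whenever the predicted rank differs from the true rank,
# ignoring how far apart the two ranks are.
C_classification = 1 - np.eye(K, dtype=int)

# Absolute cost: |y - r_k| when the ranks are treated as the integers 1..K.
ranks = np.arange(1, K + 1)
C_absolute = np.abs(ranks[:, None] - ranks[None, :])

print(C_absolute)
# Each row is V-shaped: the cost decreases towards the true rank (the zero on
# the diagonal) and increases again beyond it.
```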

Li and Lin (2007) proposed a general reduction framework for extending an ordinal regression problem into several binary classification problems. This framework requires a cost matrix that is convex in each row (C_{y,r_{k+1}} − C_{y,r_k} ≥ C_{y,r_k} − C_{y,r_{k−1}} for each y) to obtain a rank-monotonic threshold model. Since the cost-related weighting of each binary task is specific to each training example, this approach is considered infeasible in practice due to its high training complexity (Niu et al., 2016).

Our proposed CORAL framework requires neither a cost matrix with convex-row conditions nor explicit weighting terms that depend on each training example to obtain a rank-monotonic threshold model and produce consistent predictions for each binary task.

3.2. Ordinal Regression with a Consistent Rank Logits Model

In this section, we describe our proposed consistent rank logits (CORAL) framework for ordinal regression. Subsection 3.2.1 describes the label extension into binary tasks used for rank prediction. The loss function of the CORAL framework is described in Subsection 3.2.2.


Input image

7x7 conv@64stride=2 … 3x3 conv

@512stride=1

...

Weight sharingacross k-1 tasks

...

b1<latexit sha1_base64="oxRwFKiWcKO5u9w5ICndUHjxWWI=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTNdtMu3WzC7kQooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5hKYdDzvp3S2vrG5lZ5u7Kzu7d/4B4etUySacabLJGJ7oTUcCkUb6JAyTup5jQOJW+H49tZvf3EtRGJesRJyoOYDpWIBKNorYew7/fdqlfz5iKr4BdQhUKNvvvVGyQsi7lCJqkxXd9LMcipRsEkn1Z6meEpZWM65F2LisbcBPl81Sk5s86ARIm2TyGZu78nchobM4lD2xlTHJnl2sz8r9bNMLoOcqHSDLlii4+iTBJMyOxuMhCaM5QTC5RpYXclbEQ1ZWjTqdgQ/OWTV6F1UfMt319W6zdFHGU4gVM4Bx+uoA530IAmMBjCM7zCmyOdF+fd+Vi0lpxi5hj+yPn8AeofjYo=</latexit><latexit sha1_base64="oxRwFKiWcKO5u9w5ICndUHjxWWI=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTNdtMu3WzC7kQooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5hKYdDzvp3S2vrG5lZ5u7Kzu7d/4B4etUySacabLJGJ7oTUcCkUb6JAyTup5jQOJW+H49tZvf3EtRGJesRJyoOYDpWIBKNorYew7/fdqlfz5iKr4BdQhUKNvvvVGyQsi7lCJqkxXd9LMcipRsEkn1Z6meEpZWM65F2LisbcBPl81Sk5s86ARIm2TyGZu78nchobM4lD2xlTHJnl2sz8r9bNMLoOcqHSDLlii4+iTBJMyOxuMhCaM5QTC5RpYXclbEQ1ZWjTqdgQ/OWTV6F1UfMt319W6zdFHGU4gVM4Bx+uoA530IAmMBjCM7zCmyOdF+fd+Vi0lpxi5hj+yPn8AeofjYo=</latexit><latexit sha1_base64="oxRwFKiWcKO5u9w5ICndUHjxWWI=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTNdtMu3WzC7kQooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5hKYdDzvp3S2vrG5lZ5u7Kzu7d/4B4etUySacabLJGJ7oTUcCkUb6JAyTup5jQOJW+H49tZvf3EtRGJesRJyoOYDpWIBKNorYew7/fdqlfz5iKr4BdQhUKNvvvVGyQsi7lCJqkxXd9LMcipRsEkn1Z6meEpZWM65F2LisbcBPl81Sk5s86ARIm2TyGZu78nchobM4lD2xlTHJnl2sz8r9bNMLoOcqHSDLlii4+iTBJMyOxuMhCaM5QTC5RpYXclbEQ1ZWjTqdgQ/OWTV6F1UfMt319W6zdFHGU4gVM4Bx+uoA530IAmMBjCM7zCmyOdF+fd+Vi0lpxi5hj+yPn8AeofjYo=</latexit><latexit sha1_base64="oxRwFKiWcKO5u9w5ICndUHjxWWI=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTNdtMu3WzC7kQooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5hKYdDzvp3S2vrG5lZ5u7Kzu7d/4B4etUySacabLJGJ7oTUcCkUb6JAyTup5jQOJW+H49tZvf3EtRGJesRJyoOYDpWIBKNorYew7/fdqlfz5iKr4BdQhUKNvvvVGyQsi7lCJqkxXd9LMcipRsEkn1Z6meEpZWM65F2LisbcBPl81Sk5s86ARIm2TyGZu78nchobM4lD2xlTHJnl2sz8r9bNMLoOcqHSDLlii4+iTBJMyOxuMhCaM5QTC5RpYXclbEQ1ZWjTqdgQ/OWTV6F1UfMt319W6zdFHGU4gVM4Bx+uoA530IAmMBjCM7zCmyOdF+fd+Vi0lpxi5hj+yPn8AeofjYo=</latexit>

b2<latexit sha1_base64="KmwIREzjRQ4yYecphZP2rZMfohI=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lA220m7dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqWLYYrGIVTegGgWX2DLcCOwmCmkUCOwEk9t5vfOESvNYPpppgn5ER5KHnFFjrYdgUBuUK27VXYisg5dDBXI1B+Wv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYsShqh9rPFqjNyYZ0hCWNlnzRk4f6eyGik9TQKbGdEzViv1ubmf7VeasJrP+MySQ1KtvwoTAUxMZnfTYZcITNiaoEyxe2uhI2poszYdEo2BG/15HVo16qe5furSuMmj6MIZ3AOl+BBHRpwB01oAYMRPMMrvDnCeXHenY9la8HJZ07hj5zPH+ujjYs=</latexit><latexit sha1_base64="KmwIREzjRQ4yYecphZP2rZMfohI=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lA220m7dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqWLYYrGIVTegGgWX2DLcCOwmCmkUCOwEk9t5vfOESvNYPpppgn5ER5KHnFFjrYdgUBuUK27VXYisg5dDBXI1B+Wv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYsShqh9rPFqjNyYZ0hCWNlnzRk4f6eyGik9TQKbGdEzViv1ubmf7VeasJrP+MySQ1KtvwoTAUxMZnfTYZcITNiaoEyxe2uhI2poszYdEo2BG/15HVo16qe5furSuMmj6MIZ3AOl+BBHRpwB01oAYMRPMMrvDnCeXHenY9la8HJZ07hj5zPH+ujjYs=</latexit><latexit sha1_base64="KmwIREzjRQ4yYecphZP2rZMfohI=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lA220m7dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqWLYYrGIVTegGgWX2DLcCOwmCmkUCOwEk9t5vfOESvNYPpppgn5ER5KHnFFjrYdgUBuUK27VXYisg5dDBXI1B+Wv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYsShqh9rPFqjNyYZ0hCWNlnzRk4f6eyGik9TQKbGdEzViv1ubmf7VeasJrP+MySQ1KtvwoTAUxMZnfTYZcITNiaoEyxe2uhI2poszYdEo2BG/15HVo16qe5furSuMmj6MIZ3AOl+BBHRpwB01oAYMRPMMrvDnCeXHenY9la8HJZ07hj5zPH+ujjYs=</latexit><latexit sha1_base64="KmwIREzjRQ4yYecphZP2rZMfohI=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lA220m7dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqWLYYrGIVTegGgWX2DLcCOwmCmkUCOwEk9t5vfOESvNYPpppgn5ER5KHnFFjrYdgUBuUK27VXYisg5dDBXI1B+Wv/jBmaYTSMEG17nluYvyMKsOZwFmpn2pMKJvQEfYsShqh9rPFqjNyYZ0hCWNlnzRk4f6eyGik9TQKbGdEzViv1ubmf7VeasJrP+MySQ1KtvwoTAUxMZnfTYZcITNiaoEyxe2uhI2poszYdEo2BG/15HVo16qe5furSuMmj6MIZ3AOl+BBHRpwB01oAYMRPMMrvDnCeXHenY9la8HJZ07hj5zPH+ujjYs=</latexit>

w1<latexit sha1_base64="ozSIzVA/SGXegmac4XRXthOpvw0=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FQSEeqx6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+l6/XHGr7lxkFbwcKpCr0S9/9QYxSyOukElqTNdzE/QzqlEwyaelXmp4QtmYDnnXoqIRN342X3VKzqwzIGGs7VNI5u7viYxGxkyiwHZGFEdmuTYz/6t1Uwyv/EyoJEWu2OKjMJUEYzK7mwyE5gzlxAJlWthdCRtRTRnadEo2BG/55FVoXVQ9y3eXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzmfPwosjZ8=</latexit><latexit sha1_base64="ozSIzVA/SGXegmac4XRXthOpvw0=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FQSEeqx6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+l6/XHGr7lxkFbwcKpCr0S9/9QYxSyOukElqTNdzE/QzqlEwyaelXmp4QtmYDnnXoqIRN342X3VKzqwzIGGs7VNI5u7viYxGxkyiwHZGFEdmuTYz/6t1Uwyv/EyoJEWu2OKjMJUEYzK7mwyE5gzlxAJlWthdCRtRTRnadEo2BG/55FVoXVQ9y3eXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzmfPwosjZ8=</latexit><latexit sha1_base64="ozSIzVA/SGXegmac4XRXthOpvw0=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FQSEeqx6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+l6/XHGr7lxkFbwcKpCr0S9/9QYxSyOukElqTNdzE/QzqlEwyaelXmp4QtmYDnnXoqIRN342X3VKzqwzIGGs7VNI5u7viYxGxkyiwHZGFEdmuTYz/6t1Uwyv/EyoJEWu2OKjMJUEYzK7mwyE5gzlxAJlWthdCRtRTRnadEo2BG/55FVoXVQ9y3eXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzmfPwosjZ8=</latexit><latexit sha1_base64="ozSIzVA/SGXegmac4XRXthOpvw0=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FQSEeqx6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+l6/XHGr7lxkFbwcKpCr0S9/9QYxSyOukElqTNdzE/QzqlEwyaelXmp4QtmYDnnXoqIRN342X3VKzqwzIGGs7VNI5u7viYxGxkyiwHZGFEdmuTYz/6t1Uwyv/EyoJEWu2OKjMJUEYzK7mwyE5gzlxAJlWthdCRtRTRnadEo2BG/55FVoXVQ9y3eXlfp1HkcRTuAUzsGDGtThFhrQBAZDeIZXeHOk8+K8Ox+L1oKTzxzDHzmfPwosjZ8=</latexit>

w2<latexit sha1_base64="sAAe226MpFncoK5AcSpzzUnkA9I=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSIuix6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+rV+ueJW3bnIKng5VCBXo1/+6g1ilkZcIZPUmK7nJuhnVKNgkk9LvdTwhLIxHfKuRUUjbvxsvuqUnFlnQMJY26eQzN3fExmNjJlEge2MKI7Mcm1m/lfrphhe+ZlQSYpcscVHYSoJxmR2NxkIzRnKiQXKtLC7EjaimjK06ZRsCN7yyavQqlU9y3cXlfp1HkcRTuAUzsGDS6jDLTSgCQyG8Ayv8OZI58V5dz4WrQUnnzmGP3I+fwALsI2g</latexit><latexit sha1_base64="sAAe226MpFncoK5AcSpzzUnkA9I=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSIuix6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+rV+ueJW3bnIKng5VCBXo1/+6g1ilkZcIZPUmK7nJuhnVKNgkk9LvdTwhLIxHfKuRUUjbvxsvuqUnFlnQMJY26eQzN3fExmNjJlEge2MKI7Mcm1m/lfrphhe+ZlQSYpcscVHYSoJxmR2NxkIzRnKiQXKtLC7EjaimjK06ZRsCN7yyavQqlU9y3cXlfp1HkcRTuAUzsGDS6jDLTSgCQyG8Ayv8OZI58V5dz4WrQUnnzmGP3I+fwALsI2g</latexit><latexit sha1_base64="sAAe226MpFncoK5AcSpzzUnkA9I=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSIuix6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+rV+ueJW3bnIKng5VCBXo1/+6g1ilkZcIZPUmK7nJuhnVKNgkk9LvdTwhLIxHfKuRUUjbvxsvuqUnFlnQMJY26eQzN3fExmNjJlEge2MKI7Mcm1m/lfrphhe+ZlQSYpcscVHYSoJxmR2NxkIzRnKiQXKtLC7EjaimjK06ZRsCN7yyavQqlU9y3cXlfp1HkcRTuAUzsGDS6jDLTSgCQyG8Ayv8OZI58V5dz4WrQUnnzmGP3I+fwALsI2g</latexit><latexit sha1_base64="sAAe226MpFncoK5AcSpzzUnkA9I=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSIuix6MVjRfsBbSib7aZdutmE3YlSQn+CFw+KePUXefPfuG1z0NYXFh7emWFn3iCRwqDrfjuFtfWNza3idmlnd2//oHx41DJxqhlvsljGuhNQw6VQvIkCJe8kmtMokLwdjG9m9fYj10bE6gEnCfcjOlQiFIyite6f+rV+ueJW3bnIKng5VCBXo1/+6g1ilkZcIZPUmK7nJuhnVKNgkk9LvdTwhLIxHfKuRUUjbvxsvuqUnFlnQMJY26eQzN3fExmNjJlEge2MKI7Mcm1m/lfrphhe+ZlQSYpcscVHYSoJxmR2NxkIzRnKiQXKtLC7EjaimjK06ZRsCN7yyavQqlU9y3cXlfp1HkcRTuAUzsGDS6jDLTSgCQyG8Ayv8OZI58V5dz4WrQUnnzmGP3I+fwALsI2g</latexit>

...

ResNet-34

7x7 AvgPoolstride=1

wm<latexit sha1_base64="3SltFZgdSbEccduFdJMJ4sVJM+s=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHirYW2qVk07QNTbJLMquUpT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMGyVSWPT9b6+wsrq2vlHcLG1t7+zulfcPmjZODeMNFsvYtCJquRSaN1Cg5K3EcKoiyR+i0fW0/vDIjRWxvsdxwkNFB1r0BaPorLunruqWK37Vn4ksQ5BDBXLVu+WvTi9mqeIamaTWtgM/wTCjBgWTfFLqpJYnlI3ogLcdaqq4DbPZqhNy4pwe6cfGPY1k5v6eyKiydqwi16koDu1ibWr+V2un2L8MM6GTFLlm84/6qSQYk+ndpCcMZyjHDigzwu1K2JAaytClU3IhBIsnL0PzrBo4vj2v1K7yOIpwBMdwCgFcQA1uoA4NYDCAZ3iFN096L9679zFvLXj5zCH8kff5A2Ucjds=</latexit><latexit sha1_base64="3SltFZgdSbEccduFdJMJ4sVJM+s=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHirYW2qVk07QNTbJLMquUpT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMGyVSWPT9b6+wsrq2vlHcLG1t7+zulfcPmjZODeMNFsvYtCJquRSaN1Cg5K3EcKoiyR+i0fW0/vDIjRWxvsdxwkNFB1r0BaPorLunruqWK37Vn4ksQ5BDBXLVu+WvTi9mqeIamaTWtgM/wTCjBgWTfFLqpJYnlI3ogLcdaqq4DbPZqhNy4pwe6cfGPY1k5v6eyKiydqwi16koDu1ibWr+V2un2L8MM6GTFLlm84/6qSQYk+ndpCcMZyjHDigzwu1K2JAaytClU3IhBIsnL0PzrBo4vj2v1K7yOIpwBMdwCgFcQA1uoA4NYDCAZ3iFN096L9679zFvLXj5zCH8kff5A2Ucjds=</latexit><latexit sha1_base64="3SltFZgdSbEccduFdJMJ4sVJM+s=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHirYW2qVk07QNTbJLMquUpT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMGyVSWPT9b6+wsrq2vlHcLG1t7+zulfcPmjZODeMNFsvYtCJquRSaN1Cg5K3EcKoiyR+i0fW0/vDIjRWxvsdxwkNFB1r0BaPorLunruqWK37Vn4ksQ5BDBXLVu+WvTi9mqeIamaTWtgM/wTCjBgWTfFLqpJYnlI3ogLcdaqq4DbPZqhNy4pwe6cfGPY1k5v6eyKiydqwi16koDu1ibWr+V2un2L8MM6GTFLlm84/6qSQYk+ndpCcMZyjHDigzwu1K2JAaytClU3IhBIsnL0PzrBo4vj2v1K7yOIpwBMdwCgFcQA1uoA4NYDCAZ3iFN096L9679zFvLXj5zCH8kff5A2Ucjds=</latexit><latexit sha1_base64="3SltFZgdSbEccduFdJMJ4sVJM+s=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHirYW2qVk07QNTbJLMquUpT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMGyVSWPT9b6+wsrq2vlHcLG1t7+zulfcPmjZODeMNFsvYtCJquRSaN1Cg5K3EcKoiyR+i0fW0/vDIjRWxvsdxwkNFB1r0BaPorLunruqWK37Vn4ksQ5BDBXLVu+WvTi9mqeIamaTWtgM/wTCjBgWTfFLqpJYnlI3ogLcdaqq4DbPZqhNy4pwe6cfGPY1k5v6eyKiydqwi16koDu1ibWr+V2un2L8MM6GTFLlm84/6qSQYk+ndpCcMZyjHDigzwu1K2JAaytClU3IhBIsnL0PzrBo4vj2v1K7yOIpwBMdwCgFcQA1uoA4NYDCAZ3iFN096L9679zFvLXj5zCH8kff5A2Ucjds=</latexit>

{ <latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit>

K � 1<latexit sha1_base64="r9EyNDc4ZkK0RbMq+OXc0Let6lU=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8GJJRNBj0YvgpaL9gDaUzXbSLt1swu5GKKE/wYsHRbz6i7z5b9y2OWjrCwsP78ywM2+QCK6N6347hZXVtfWN4mZpa3tnd6+8f9DUcaoYNlgsYtUOqEbBJTYMNwLbiUIaBQJbwehmWm89odI8lo9mnKAf0YHkIWfUWOvh7szrlStu1Z2JLIOXQwVy1Xvlr24/ZmmE0jBBte54bmL8jCrDmcBJqZtqTCgb0QF2LEoaofaz2aoTcmKdPgljZZ80ZOb+nshopPU4CmxnRM1QL9am5n+1TmrCKz/jMkkNSjb/KEwFMTGZ3k36XCEzYmyBMsXtroQNqaLM2HRKNgRv8eRlaJ5XPcv3F5XadR5HEY7gGE7Bg0uowS3UoQEMBvAMr/DmCOfFeXc+5q0FJ585hD9yPn8AexuNQQ==</latexit><latexit sha1_base64="r9EyNDc4ZkK0RbMq+OXc0Let6lU=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8GJJRNBj0YvgpaL9gDaUzXbSLt1swu5GKKE/wYsHRbz6i7z5b9y2OWjrCwsP78ywM2+QCK6N6347hZXVtfWN4mZpa3tnd6+8f9DUcaoYNlgsYtUOqEbBJTYMNwLbiUIaBQJbwehmWm89odI8lo9mnKAf0YHkIWfUWOvh7szrlStu1Z2JLIOXQwVy1Xvlr24/ZmmE0jBBte54bmL8jCrDmcBJqZtqTCgb0QF2LEoaofaz2aoTcmKdPgljZZ80ZOb+nshopPU4CmxnRM1QL9am5n+1TmrCKz/jMkkNSjb/KEwFMTGZ3k36XCEzYmyBMsXtroQNqaLM2HRKNgRv8eRlaJ5XPcv3F5XadR5HEY7gGE7Bg0uowS3UoQEMBvAMr/DmCOfFeXc+5q0FJ585hD9yPn8AexuNQQ==</latexit><latexit sha1_base64="r9EyNDc4ZkK0RbMq+OXc0Let6lU=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8GJJRNBj0YvgpaL9gDaUzXbSLt1swu5GKKE/wYsHRbz6i7z5b9y2OWjrCwsP78ywM2+QCK6N6347hZXVtfWN4mZpa3tnd6+8f9DUcaoYNlgsYtUOqEbBJTYMNwLbiUIaBQJbwehmWm89odI8lo9mnKAf0YHkIWfUWOvh7szrlStu1Z2JLIOXQwVy1Xvlr24/ZmmE0jBBte54bmL8jCrDmcBJqZtqTCgb0QF2LEoaofaz2aoTcmKdPgljZZ80ZOb+nshopPU4CmxnRM1QL9am5n+1TmrCKz/jMkkNSjb/KEwFMTGZ3k36XCEzYmyBMsXtroQNqaLM2HRKNgRv8eRlaJ5XPcv3F5XadR5HEY7gGE7Bg0uowS3UoQEMBvAMr/DmCOfFeXc+5q0FJ585hD9yPn8AexuNQQ==</latexit><latexit sha1_base64="r9EyNDc4ZkK0RbMq+OXc0Let6lU=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8GJJRNBj0YvgpaL9gDaUzXbSLt1swu5GKKE/wYsHRbz6i7z5b9y2OWjrCwsP78ywM2+QCK6N6347hZXVtfWN4mZpa3tnd6+8f9DUcaoYNlgsYtUOqEbBJTYMNwLbiUIaBQJbwehmWm89odI8lo9mnKAf0YHkIWfUWOvh7szrlStu1Z2JLIOXQwVy1Xvlr24/ZmmE0jBBte54bmL8jCrDmcBJqZtqTCgb0QF2LEoaofaz2aoTcmKdPgljZZ80ZOb+nshopPU4CmxnRM1QL9am5n+1TmrCKz/jMkkNSjb/KEwFMTGZ3k36XCEzYmyBMsXtroQNqaLM2HRKNgRv8eRlaJ5XPcv3F5XadR5HEY7gGE7Bg0uowS3UoQEMBvAMr/DmCOfFeXc+5q0FJ585hD9yPn8AexuNQQ==</latexit>

bK�1<latexit sha1_base64="LwOEvZG6tsrLM9p/zFnxWW88ucU=">AAAB7nicbZBNS8NAEIYn9avWr6pHL4tF8GJJpKDHohfBSwX7AW0om+2mXbrZhN2JUEJ/hBcPinj193jz37htc9DWFxYe3plhZ94gkcKg6347hbX1jc2t4nZpZ3dv/6B8eNQycaoZb7JYxroTUMOlULyJAiXvJJrTKJC8HYxvZ/X2E9dGxOoRJwn3IzpUIhSMorXaQT+7v/Cm/XLFrbpzkVXwcqhArka//NUbxCyNuEImqTFdz03Qz6hGwSSflnqp4QllYzrkXYuKRtz42XzdKTmzzoCEsbZPIZm7vycyGhkziQLbGVEcmeXazPyv1k0xvPYzoZIUuWKLj8JUEozJ7HYyEJozlBMLlGlhdyVsRDVlaBMq2RC85ZNXoXVZ9Sw/1Cr1mzyOIpzAKZyDB1dQhztoQBMYjOEZXuHNSZwX5935WLQWnHzmGP7I+fwBshSPIg==</latexit><latexit sha1_base64="LwOEvZG6tsrLM9p/zFnxWW88ucU=">AAAB7nicbZBNS8NAEIYn9avWr6pHL4tF8GJJpKDHohfBSwX7AW0om+2mXbrZhN2JUEJ/hBcPinj193jz37htc9DWFxYe3plhZ94gkcKg6347hbX1jc2t4nZpZ3dv/6B8eNQycaoZb7JYxroTUMOlULyJAiXvJJrTKJC8HYxvZ/X2E9dGxOoRJwn3IzpUIhSMorXaQT+7v/Cm/XLFrbpzkVXwcqhArka//NUbxCyNuEImqTFdz03Qz6hGwSSflnqp4QllYzrkXYuKRtz42XzdKTmzzoCEsbZPIZm7vycyGhkziQLbGVEcmeXazPyv1k0xvPYzoZIUuWKLj8JUEozJ7HYyEJozlBMLlGlhdyVsRDVlaBMq2RC85ZNXoXVZ9Sw/1Cr1mzyOIpzAKZyDB1dQhztoQBMYjOEZXuHNSZwX5935WLQWnHzmGP7I+fwBshSPIg==</latexit><latexit sha1_base64="LwOEvZG6tsrLM9p/zFnxWW88ucU=">AAAB7nicbZBNS8NAEIYn9avWr6pHL4tF8GJJpKDHohfBSwX7AW0om+2mXbrZhN2JUEJ/hBcPinj193jz37htc9DWFxYe3plhZ94gkcKg6347hbX1jc2t4nZpZ3dv/6B8eNQycaoZb7JYxroTUMOlULyJAiXvJJrTKJC8HYxvZ/X2E9dGxOoRJwn3IzpUIhSMorXaQT+7v/Cm/XLFrbpzkVXwcqhArka//NUbxCyNuEImqTFdz03Qz6hGwSSflnqp4QllYzrkXYuKRtz42XzdKTmzzoCEsbZPIZm7vycyGhkziQLbGVEcmeXazPyv1k0xvPYzoZIUuWKLj8JUEozJ7HYyEJozlBMLlGlhdyVsRDVlaBMq2RC85ZNXoXVZ9Sw/1Cr1mzyOIpzAKZyDB1dQhztoQBMYjOEZXuHNSZwX5935WLQWnHzmGP7I+fwBshSPIg==</latexit><latexit sha1_base64="LwOEvZG6tsrLM9p/zFnxWW88ucU=">AAAB7nicbZBNS8NAEIYn9avWr6pHL4tF8GJJpKDHohfBSwX7AW0om+2mXbrZhN2JUEJ/hBcPinj193jz37htc9DWFxYe3plhZ94gkcKg6347hbX1jc2t4nZpZ3dv/6B8eNQycaoZb7JYxroTUMOlULyJAiXvJJrTKJC8HYxvZ/X2E9dGxOoRJwn3IzpUIhSMorXaQT+7v/Cm/XLFrbpzkVXwcqhArka//NUbxCyNuEImqTFdz03Qz6hGwSSflnqp4QllYzrkXYuKRtz42XzdKTmzzoCEsbZPIZm7vycyGhkziQLbGVEcmeXazPyv1k0xvPYzoZIUuWKLj8JUEozJ7HYyEJozlBMLlGlhdyVsRDVlaBMq2RC85ZNXoXVZ9Sw/1Cr1mzyOIpzAKZyDB1dQhztoQBMYjOEZXuHNSZwX5935WLQWnHzmGP7I+fwBshSPIg==</latexit>

bP (yi > r1)<latexit sha1_base64="JEWZgW/c9FbzIeNeK3w5m6yuKL4=">AAAB/XicbZDLSsNAFIYn9VbrLV52boJFqJuSiKArKbpxWcFeoA1hMjlph04mYWaixFB8FTcuFHHre7jzbZy2WWjrDwMf/zmHc+b3E0alsu1vo7S0vLK6Vl6vbGxube+Yu3ttGaeCQIvELBZdH0tglENLUcWgmwjAkc+g44+uJ/XOPQhJY36nsgTcCA84DSnBSlueedB/oAEMscqb41rm0UvhOSeeWbXr9lTWIjgFVFGhpmd+9YOYpBFwRRiWsufYiXJzLBQlDMaVfiohwWSEB9DTyHEE0s2n14+tY+0EVhgL/biypu7viRxHUmaRrzsjrIZyvjYx/6v1UhVeuDnlSaqAk9miMGWWiq1JFFZABRDFMg2YCKpvtcgQC0yUDqyiQ3Dmv7wI7dO6o/n2rNq4KuIoo0N0hGrIQeeogW5QE7UQQY/oGb2iN+PJeDHejY9Za8koZvbRHxmfP9pzlM8=</latexit><latexit sha1_base64="JEWZgW/c9FbzIeNeK3w5m6yuKL4=">AAAB/XicbZDLSsNAFIYn9VbrLV52boJFqJuSiKArKbpxWcFeoA1hMjlph04mYWaixFB8FTcuFHHre7jzbZy2WWjrDwMf/zmHc+b3E0alsu1vo7S0vLK6Vl6vbGxube+Yu3ttGaeCQIvELBZdH0tglENLUcWgmwjAkc+g44+uJ/XOPQhJY36nsgTcCA84DSnBSlueedB/oAEMscqb41rm0UvhOSeeWbXr9lTWIjgFVFGhpmd+9YOYpBFwRRiWsufYiXJzLBQlDMaVfiohwWSEB9DTyHEE0s2n14+tY+0EVhgL/biypu7viRxHUmaRrzsjrIZyvjYx/6v1UhVeuDnlSaqAk9miMGWWiq1JFFZABRDFMg2YCKpvtcgQC0yUDqyiQ3Dmv7wI7dO6o/n2rNq4KuIoo0N0hGrIQeeogW5QE7UQQY/oGb2iN+PJeDHejY9Za8koZvbRHxmfP9pzlM8=</latexit><latexit sha1_base64="JEWZgW/c9FbzIeNeK3w5m6yuKL4=">AAAB/XicbZDLSsNAFIYn9VbrLV52boJFqJuSiKArKbpxWcFeoA1hMjlph04mYWaixFB8FTcuFHHre7jzbZy2WWjrDwMf/zmHc+b3E0alsu1vo7S0vLK6Vl6vbGxube+Yu3ttGaeCQIvELBZdH0tglENLUcWgmwjAkc+g44+uJ/XOPQhJY36nsgTcCA84DSnBSlueedB/oAEMscqb41rm0UvhOSeeWbXr9lTWIjgFVFGhpmd+9YOYpBFwRRiWsufYiXJzLBQlDMaVfiohwWSEB9DTyHEE0s2n14+tY+0EVhgL/biypu7viRxHUmaRrzsjrIZyvjYx/6v1UhVeuDnlSaqAk9miMGWWiq1JFFZABRDFMg2YCKpvtcgQC0yUDqyiQ3Dmv7wI7dO6o/n2rNq4KuIoo0N0hGrIQeeogW5QE7UQQY/oGb2iN+PJeDHejY9Za8koZvbRHxmfP9pzlM8=</latexit><latexit sha1_base64="JEWZgW/c9FbzIeNeK3w5m6yuKL4=">AAAB/XicbZDLSsNAFIYn9VbrLV52boJFqJuSiKArKbpxWcFeoA1hMjlph04mYWaixFB8FTcuFHHre7jzbZy2WWjrDwMf/zmHc+b3E0alsu1vo7S0vLK6Vl6vbGxube+Yu3ttGaeCQIvELBZdH0tglENLUcWgmwjAkc+g44+uJ/XOPQhJY36nsgTcCA84DSnBSlueedB/oAEMscqb41rm0UvhOSeeWbXr9lTWIjgFVFGhpmd+9YOYpBFwRRiWsufYiXJzLBQlDMaVfiohwWSEB9DTyHEE0s2n14+tY+0EVhgL/biypu7viRxHUmaRrzsjrIZyvjYx/6v1UhVeuDnlSaqAk9miMGWWiq1JFFZABRDFMg2YCKpvtcgQC0yUDqyiQ3Dmv7wI7dO6o/n2rNq4KuIoo0N0hGrIQeeogW5QE7UQQY/oGb2iN+PJeDHejY9Za8koZvbRHxmfP9pzlM8=</latexit>

bP (yi > r2)<latexit sha1_base64="TTPnmRaLx/eL+eApEw1jpI9oVN4=">AAAB/XicbZDLSsNAFIZPvNZ6i5edm2AR6qYkRdCVFN24rGAv0IYwmUzaoZNJmJkoMRRfxY0LRdz6Hu58G6dtFtr6w8DHf87hnPn9hFGpbPvbWFpeWV1bL22UN7e2d3bNvf22jFOBSQvHLBZdH0nCKCctRRUj3UQQFPmMdPzR9aTeuSdC0pjfqSwhboQGnIYUI6UtzzzsP9CADJHKm+Nq5tFL4dVPPbNi1+yprEVwCqhAoaZnfvWDGKcR4QozJGXPsRPl5kgoihkZl/upJAnCIzQgPY0cRUS6+fT6sXWincAKY6EfV9bU/T2Ro0jKLPJ1Z4TUUM7XJuZ/tV6qwgs3pzxJFeF4tihMmaViaxKFFVBBsGKZBoQF1bdaeIgEwkoHVtYhOPNfXoR2veZovj2rNK6KOEpwBMdQBQfOoQE30IQWYHiEZ3iFN+PJeDHejY9Z65JRzBzAHxmfP9v4lNA=</latexit><latexit sha1_base64="TTPnmRaLx/eL+eApEw1jpI9oVN4=">AAAB/XicbZDLSsNAFIZPvNZ6i5edm2AR6qYkRdCVFN24rGAv0IYwmUzaoZNJmJkoMRRfxY0LRdz6Hu58G6dtFtr6w8DHf87hnPn9hFGpbPvbWFpeWV1bL22UN7e2d3bNvf22jFOBSQvHLBZdH0nCKCctRRUj3UQQFPmMdPzR9aTeuSdC0pjfqSwhboQGnIYUI6UtzzzsP9CADJHKm+Nq5tFL4dVPPbNi1+yprEVwCqhAoaZnfvWDGKcR4QozJGXPsRPl5kgoihkZl/upJAnCIzQgPY0cRUS6+fT6sXWincAKY6EfV9bU/T2Ro0jKLPJ1Z4TUUM7XJuZ/tV6qwgs3pzxJFeF4tihMmaViaxKFFVBBsGKZBoQF1bdaeIgEwkoHVtYhOPNfXoR2veZovj2rNK6KOEpwBMdQBQfOoQE30IQWYHiEZ3iFN+PJeDHejY9Z65JRzBzAHxmfP9v4lNA=</latexit><latexit sha1_base64="TTPnmRaLx/eL+eApEw1jpI9oVN4=">AAAB/XicbZDLSsNAFIZPvNZ6i5edm2AR6qYkRdCVFN24rGAv0IYwmUzaoZNJmJkoMRRfxY0LRdz6Hu58G6dtFtr6w8DHf87hnPn9hFGpbPvbWFpeWV1bL22UN7e2d3bNvf22jFOBSQvHLBZdH0nCKCctRRUj3UQQFPmMdPzR9aTeuSdC0pjfqSwhboQGnIYUI6UtzzzsP9CADJHKm+Nq5tFL4dVPPbNi1+yprEVwCqhAoaZnfvWDGKcR4QozJGXPsRPl5kgoihkZl/upJAnCIzQgPY0cRUS6+fT6sXWincAKY6EfV9bU/T2Ro0jKLPJ1Z4TUUM7XJuZ/tV6qwgs3pzxJFeF4tihMmaViaxKFFVBBsGKZBoQF1bdaeIgEwkoHVtYhOPNfXoR2veZovj2rNK6KOEpwBMdQBQfOoQE30IQWYHiEZ3iFN+PJeDHejY9Z65JRzBzAHxmfP9v4lNA=</latexit><latexit sha1_base64="TTPnmRaLx/eL+eApEw1jpI9oVN4=">AAAB/XicbZDLSsNAFIZPvNZ6i5edm2AR6qYkRdCVFN24rGAv0IYwmUzaoZNJmJkoMRRfxY0LRdz6Hu58G6dtFtr6w8DHf87hnPn9hFGpbPvbWFpeWV1bL22UN7e2d3bNvf22jFOBSQvHLBZdH0nCKCctRRUj3UQQFPmMdPzR9aTeuSdC0pjfqSwhboQGnIYUI6UtzzzsP9CADJHKm+Nq5tFL4dVPPbNi1+yprEVwCqhAoaZnfvWDGKcR4QozJGXPsRPl5kgoihkZl/upJAnCIzQgPY0cRUS6+fT6sXWincAKY6EfV9bU/T2Ro0jKLPJ1Z4TUUM7XJuZ/tV6qwgs3pzxJFeF4tihMmaViaxKFFVBBsGKZBoQF1bdaeIgEwkoHVtYhOPNfXoR2veZovj2rNK6KOEpwBMdQBQfOoQE30IQWYHiEZ3iFN+PJeDHejY9Z65JRzBzAHxmfP9v4lNA=</latexit>

bP (yi > rK�1)<latexit sha1_base64="EsrSGPlu5mWGfLIJjs5E47reN0Y=">AAACAXicbZDLSsNAFIYn9VbrLepGcDNYhLqwJCLoSopuBDcV7AXaECaTSTt0MgkzEyWEuPFV3LhQxK1v4c63cdpmoa0/DHz85xzOnN+LGZXKsr6N0sLi0vJKebWytr6xuWVu77RllAhMWjhikeh6SBJGOWkpqhjpxoKg0GOk442uxvXOPRGSRvxOpTFxQjTgNKAYKW255l7/gfpkiFTWzGupSy+Em90c2/mRa1atujURnAe7gCoo1HTNr74f4SQkXGGGpOzZVqycDAlFMSN5pZ9IEiM8QgPS08hRSKSTTS7I4aF2fBhEQj+u4MT9PZGhUMo09HRniNRQztbG5n+1XqKCcyejPE4U4Xi6KEgYVBEcxwF9KghWLNWAsKD6rxAPkUBY6dAqOgR79uR5aJ/Ubc23p9XGZRFHGeyDA1ADNjgDDXANmqAFMHgEz+AVvBlPxovxbnxMW0tGMbML/sj4/AG94JZn</latexit><latexit sha1_base64="EsrSGPlu5mWGfLIJjs5E47reN0Y=">AAACAXicbZDLSsNAFIYn9VbrLepGcDNYhLqwJCLoSopuBDcV7AXaECaTSTt0MgkzEyWEuPFV3LhQxK1v4c63cdpmoa0/DHz85xzOnN+LGZXKsr6N0sLi0vJKebWytr6xuWVu77RllAhMWjhikeh6SBJGOWkpqhjpxoKg0GOk442uxvXOPRGSRvxOpTFxQjTgNKAYKW255l7/gfpkiFTWzGupSy+Em90c2/mRa1atujURnAe7gCoo1HTNr74f4SQkXGGGpOzZVqycDAlFMSN5pZ9IEiM8QgPS08hRSKSTTS7I4aF2fBhEQj+u4MT9PZGhUMo09HRniNRQztbG5n+1XqKCcyejPE4U4Xi6KEgYVBEcxwF9KghWLNWAsKD6rxAPkUBY6dAqOgR79uR5aJ/Ubc23p9XGZRFHGeyDA1ADNjgDDXANmqAFMHgEz+AVvBlPxovxbnxMW0tGMbML/sj4/AG94JZn</latexit><latexit sha1_base64="EsrSGPlu5mWGfLIJjs5E47reN0Y=">AAACAXicbZDLSsNAFIYn9VbrLepGcDNYhLqwJCLoSopuBDcV7AXaECaTSTt0MgkzEyWEuPFV3LhQxK1v4c63cdpmoa0/DHz85xzOnN+LGZXKsr6N0sLi0vJKebWytr6xuWVu77RllAhMWjhikeh6SBJGOWkpqhjpxoKg0GOk442uxvXOPRGSRvxOpTFxQjTgNKAYKW255l7/gfpkiFTWzGupSy+Em90c2/mRa1atujURnAe7gCoo1HTNr74f4SQkXGGGpOzZVqycDAlFMSN5pZ9IEiM8QgPS08hRSKSTTS7I4aF2fBhEQj+u4MT9PZGhUMo09HRniNRQztbG5n+1XqKCcyejPE4U4Xi6KEgYVBEcxwF9KghWLNWAsKD6rxAPkUBY6dAqOgR79uR5aJ/Ubc23p9XGZRFHGeyDA1ADNjgDDXANmqAFMHgEz+AVvBlPxovxbnxMW0tGMbML/sj4/AG94JZn</latexit><latexit sha1_base64="EsrSGPlu5mWGfLIJjs5E47reN0Y=">AAACAXicbZDLSsNAFIYn9VbrLepGcDNYhLqwJCLoSopuBDcV7AXaECaTSTt0MgkzEyWEuPFV3LhQxK1v4c63cdpmoa0/DHz85xzOnN+LGZXKsr6N0sLi0vJKebWytr6xuWVu77RllAhMWjhikeh6SBJGOWkpqhjpxoKg0GOk442uxvXOPRGSRvxOpTFxQjTgNKAYKW255l7/gfpkiFTWzGupSy+Em90c2/mRa1atujURnAe7gCoo1HTNr74f4SQkXGGGpOzZVqycDAlFMSN5pZ9IEiM8QgPS08hRSKSTTS7I4aF2fBhEQj+u4MT9PZGhUMo09HRniNRQztbG5n+1XqKCcyejPE4U4Xi6KEgYVBEcxwF9KghWLNWAsKD6rxAPkUBY6dAqOgR79uR5aJ/Ubc23p9XGZRFHGeyDA1ADNjgDDXANmqAFMHgEz+AVvBlPxovxbnxMW0tGMbML/sj4/AG94JZn</latexit>

{ <latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit>

Tasks

{<latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit><latexit sha1_base64="tFROaEGhmnnMpUlcEPFjR2z/BNI=">AAAB6XicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxiv2ANpTNdtIu3WzC7kYoof/AiwdFvPqPvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkUwybLBGJ6oRUo+ASm4YbgZ1UIY1Dge1wfDurt59QaZ7IRzNJMYjpUPKIM2qs9dDL+27Vq3lzkVXwC6hCoUbf/eoNEpbFKA0TVOuu76UmyKkynAmcVnqZxpSyMR1i16KkMeogn286JWfWGZAoUfZJQ+bu74mcxlpP4tB2xtSM9HJtZv5X62Ymug5yLtPMoGSLj6JMEJOQ2dlkwBUyIyYWKFPc7krYiCrKjA2nYkPwl09ehdZFzbd8f1mt3xRxlOEETuEcfLiCOtxBA5rAIIJneIU3Z+y8OO/Ox6K15BQzx/BHzucPm4aNZQ==</latexit>

a1<latexit sha1_base64="i5nKWL2osYHJ2SU7NvALg/zpMfE=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTJdtMu3WzC7kYooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkU5Q1aSIS1QlRM8ElaxpuBOukimEcCtYOx7ezevuJKc0T+WgmKQtiHEoecYrGWg/Y9/tu1at5c5FV8AuoQqFG3/3qDRKaxUwaKlDrru+lJshRGU4Fm1Z6mWYp0jEOWdeixJjpIJ+vOiVn1hmQKFH2SUPm7u+JHGOtJ3FoO2M0I71cm5n/1bqZia6DnMs0M0zSxUdRJohJyOxuMuCKUSMmFpAqbncldIQKqbHpVGwI/vLJq9C6qPmW7y+r9ZsijjKcwCmcgw9XUIc7aEATKAzhGV7hzRHOi/PufCxaS04xcwx/5Hz+AOiZjYk=</latexit><latexit sha1_base64="i5nKWL2osYHJ2SU7NvALg/zpMfE=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTJdtMu3WzC7kYooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkU5Q1aSIS1QlRM8ElaxpuBOukimEcCtYOx7ezevuJKc0T+WgmKQtiHEoecYrGWg/Y9/tu1at5c5FV8AuoQqFG3/3qDRKaxUwaKlDrru+lJshRGU4Fm1Z6mWYp0jEOWdeixJjpIJ+vOiVn1hmQKFH2SUPm7u+JHGOtJ3FoO2M0I71cm5n/1bqZia6DnMs0M0zSxUdRJohJyOxuMuCKUSMmFpAqbncldIQKqbHpVGwI/vLJq9C6qPmW7y+r9ZsijjKcwCmcgw9XUIc7aEATKAzhGV7hzRHOi/PufCxaS04xcwx/5Hz+AOiZjYk=</latexit><latexit sha1_base64="i5nKWL2osYHJ2SU7NvALg/zpMfE=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTJdtMu3WzC7kYooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkU5Q1aSIS1QlRM8ElaxpuBOukimEcCtYOx7ezevuJKc0T+WgmKQtiHEoecYrGWg/Y9/tu1at5c5FV8AuoQqFG3/3qDRKaxUwaKlDrru+lJshRGU4Fm1Z6mWYp0jEOWdeixJjpIJ+vOiVn1hmQKFH2SUPm7u+JHGOtJ3FoO2M0I71cm5n/1bqZia6DnMs0M0zSxUdRJohJyOxuMuCKUSMmFpAqbncldIQKqbHpVGwI/vLJq9C6qPmW7y+r9ZsijjKcwCmcgw9XUIc7aEATKAzhGV7hzRHOi/PufCxaS04xcwx/5Hz+AOiZjYk=</latexit><latexit sha1_base64="i5nKWL2osYHJ2SU7NvALg/zpMfE=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FQSEfRY9OKxov2ANpTJdtMu3WzC7kYooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5gKro3nfTultfWNza3ydmVnd2//wD08aukkU5Q1aSIS1QlRM8ElaxpuBOukimEcCtYOx7ezevuJKc0T+WgmKQtiHEoecYrGWg/Y9/tu1at5c5FV8AuoQqFG3/3qDRKaxUwaKlDrru+lJshRGU4Fm1Z6mWYp0jEOWdeixJjpIJ+vOiVn1hmQKFH2SUPm7u+JHGOtJ3FoO2M0I71cm5n/1bqZia6DnMs0M0zSxUdRJohJyOxuMuCKUSMmFpAqbncldIQKqbHpVGwI/vLJq9C6qPmW7y+r9ZsijjKcwCmcgw9XUIc7aEATKAzhGV7hzRHOi/PufCxaS04xcwx/5Hz+AOiZjYk=</latexit>

a2<latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="ck8pdC+ekZH4nUmSP+ZG7r8lEyk=">AAAB2XicbZDNSgMxFIXv1L86Vq1rN8EiuCozbnQpuHFZwbZCO5RM5k4bmskMyR2hDH0BF25EfC93vo3pz0JbDwQ+zknIvSculLQUBN9ebWd3b/+gfugfNfzjk9Nmo2fz0gjsilzl5jnmFpXU2CVJCp8LgzyLFfbj6f0i77+gsTLXTzQrMMr4WMtUCk7O6oyaraAdLMW2IVxDC9YaNb+GSS7KDDUJxa0dhEFBUcUNSaFw7g9LiwUXUz7GgUPNM7RRtRxzzi6dk7A0N+5oYkv394uKZ9bOstjdzDhN7Ga2MP/LBiWlt1EldVESarH6KC0Vo5wtdmaJNChIzRxwYaSblYkJN1yQa8Z3HYSbG29D77odOn4MoA7ncAFXEMIN3MEDdKALAhJ4hXdv4r15H6uuat66tDP4I+/zBzjGijg=</latexit><latexit sha1_base64="5EnDFfMDBOio/2eay9En10Yo+dY=">AAAB33icbZBLSwMxFIXv+Ky1anXrJlgEV2WmG10KblxWtA9oh5JJ77ShmcyQ3BHK0J/gxoUi/it3/hvTx0JbDwQ+zknIvSfKlLTk+9/e1vbO7t5+6aB8WDk6PqmeVto2zY3AlkhVaroRt6ikxhZJUtjNDPIkUtiJJnfzvPOMxspUP9E0wzDhIy1jKTg565EPGoNqza/7C7FNCFZQg5Wag+pXf5iKPEFNQnFre4GfUVhwQ1IonJX7ucWMiwkfYc+h5gnasFiMOmOXzhmyODXuaGIL9/eLgifWTpPI3Uw4je16Njf/y3o5xTdhIXWWE2qx/CjOFaOUzfdmQ2lQkJo64MJINysTY264INdO2ZUQrK+8Ce1GPXD84EMJzuECriCAa7iFe2hCCwSM4AXe4N1T3qv3saxry1v1dgZ/5H3+AM+5jDU=</latexit><latexit sha1_base64="5EnDFfMDBOio/2eay9En10Yo+dY=">AAAB33icbZBLSwMxFIXv+Ky1anXrJlgEV2WmG10KblxWtA9oh5JJ77ShmcyQ3BHK0J/gxoUi/it3/hvTx0JbDwQ+zknIvSfKlLTk+9/e1vbO7t5+6aB8WDk6PqmeVto2zY3AlkhVaroRt6ikxhZJUtjNDPIkUtiJJnfzvPOMxspUP9E0wzDhIy1jKTg565EPGoNqza/7C7FNCFZQg5Wag+pXf5iKPEFNQnFre4GfUVhwQ1IonJX7ucWMiwkfYc+h5gnasFiMOmOXzhmyODXuaGIL9/eLgifWTpPI3Uw4je16Njf/y3o5xTdhIXWWE2qx/CjOFaOUzfdmQ2lQkJo64MJINysTY264INdO2ZUQrK+8Ce1GPXD84EMJzuECriCAa7iFe2hCCwSM4AXe4N1T3qv3saxry1v1dgZ/5H3+AM+5jDU=</latexit><latexit sha1_base64="p8tLOQbHVpAOVm/uqaDbqhYE8i4=">AAAB6nicbZBNS8NAEIYn9avWr6hHL4tF8FSSXvRY9OKxov2ANpTJdtMu3WzC7kYooT/BiwdFvPqLvPlv3LY5aOsLCw/vzLAzb5gKro3nfTuljc2t7Z3ybmVv/+DwyD0+aeskU5S1aCIS1Q1RM8ElaxluBOumimEcCtYJJ7fzeueJKc0T+WimKQtiHEkecYrGWg84qA/cqlfzFiLr4BdQhULNgfvVHyY0i5k0VKDWPd9LTZCjMpwKNqv0M81SpBMcsZ5FiTHTQb5YdUYurDMkUaLsk4Ys3N8TOcZaT+PQdsZoxnq1Njf/q/UyE10HOZdpZpiky4+iTBCTkPndZMgVo0ZMLSBV3O5K6BgVUmPTqdgQ/NWT16Fdr/mW771q46aIowxncA6X4MMVNOAOmtACCiN4hld4c4Tz4rw7H8vWklPMnMIfOZ8/6N2Nhg==</latexit><latexit 
sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit><latexit sha1_base64="fHY8L0Lq6jayN2ybi5nRkuRgM8k=">AAAB6nicbZBNS8NAEIYn9avWr6pHL4tF8FSSItRj0YvHivYD2lAm2027dLMJuxuhhP4ELx4U8eov8ua/cdvmoK0vLDy8M8POvEEiuDau++0UNja3tneKu6W9/YPDo/LxSVvHqaKsRWMRq26AmgkuWctwI1g3UQyjQLBOMLmd1ztPTGkey0czTZgf4UjykFM01nrAQW1QrrhVdyGyDl4OFcjVHJS/+sOYphGThgrUuue5ifEzVIZTwWalfqpZgnSCI9azKDFi2s8Wq87IhXWGJIyVfdKQhft7IsNI62kU2M4IzViv1ubmf7VeasJrP+MySQ2TdPlRmApiYjK/mwy5YtSIqQWkittdCR2jQmpsOiUbgrd68jq0a1XP8v1VpXGTx1GEMziHS/CgDg24gya0gMIInuEV3hzhvDjvzseyteDkM6fwR87nD+odjYo=</latexit>

am<latexit sha1_base64="uGkhqy/YuhTKr1EIG2B45HDVWx0=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHivYD2qXMptk2NMkuSVYopT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMG6WCG+v7315hbX1jc6u4XdrZ3ds/KB8eNU2SacoaNBGJbkdomOCKNSy3grVTzVBGgrWi0e2s3npi2vBEPdpxykKJA8VjTtE66wF7sleu+FV/LrIKQQ4VyFXvlb+6/YRmkilLBRrTCfzUhhPUllPBpqVuZliKdIQD1nGoUDITTuarTsmZc/okTrR7ypK5+3tigtKYsYxcp0Q7NMu1mflfrZPZ+DqccJVmlim6+CjOBLEJmd1N+lwzasXYAVLN3a6EDlEjtS6dkgshWD55FZoX1cDx/WWldpPHUYQTOIVzCOAKanAHdWgAhQE8wyu8ecJ78d69j0VrwctnjuGPvM8fQ5iNxQ==</latexit><latexit sha1_base64="uGkhqy/YuhTKr1EIG2B45HDVWx0=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHivYD2qXMptk2NMkuSVYopT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMG6WCG+v7315hbX1jc6u4XdrZ3ds/KB8eNU2SacoaNBGJbkdomOCKNSy3grVTzVBGgrWi0e2s3npi2vBEPdpxykKJA8VjTtE66wF7sleu+FV/LrIKQQ4VyFXvlb+6/YRmkilLBRrTCfzUhhPUllPBpqVuZliKdIQD1nGoUDITTuarTsmZc/okTrR7ypK5+3tigtKYsYxcp0Q7NMu1mflfrZPZ+DqccJVmlim6+CjOBLEJmd1N+lwzasXYAVLN3a6EDlEjtS6dkgshWD55FZoX1cDx/WWldpPHUYQTOIVzCOAKanAHdWgAhQE8wyu8ecJ78d69j0VrwctnjuGPvM8fQ5iNxQ==</latexit><latexit sha1_base64="uGkhqy/YuhTKr1EIG2B45HDVWx0=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHivYD2qXMptk2NMkuSVYopT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMG6WCG+v7315hbX1jc6u4XdrZ3ds/KB8eNU2SacoaNBGJbkdomOCKNSy3grVTzVBGgrWi0e2s3npi2vBEPdpxykKJA8VjTtE66wF7sleu+FV/LrIKQQ4VyFXvlb+6/YRmkilLBRrTCfzUhhPUllPBpqVuZliKdIQD1nGoUDITTuarTsmZc/okTrR7ypK5+3tigtKYsYxcp0Q7NMu1mflfrZPZ+DqccJVmlim6+CjOBLEJmd1N+lwzasXYAVLN3a6EDlEjtS6dkgshWD55FZoX1cDx/WWldpPHUYQTOIVzCOAKanAHdWgAhQE8wyu8ecJ78d69j0VrwctnjuGPvM8fQ5iNxQ==</latexit><latexit sha1_base64="uGkhqy/YuhTKr1EIG2B45HDVWx0=">AAAB6nicbZBNSwMxEIZn61etX1WPXoJF8FR2RdBj0YvHivYD2qXMptk2NMkuSVYopT/BiwdFvPqLvPlvTNs9aOsLgYd3ZsjMG6WCG+v7315hbX1jc6u4XdrZ3ds/KB8eNU2SacoaNBGJbkdomOCKNSy3grVTzVBGgrWi0e2s3npi2vBEPdpxykKJA8VjTtE66wF7sleu+FV/LrIKQQ4VyFXvlb+6/YRmkilLBRrTCfzUhhPUllPBpqVuZliKdIQD1nGoUDITTuarTsmZc/okTrR7ypK5+3tigtKYsYxcp0Q7NMu1mflfrZPZ+DqccJVmlim6+CjOBLEJmd1N+lwzasXYAVLN3a6EDlEjtS6dkgshWD55FZoX1cDx/WWldpPHUYQTOIVzCOAKanAHdWgAhQE8wyu8ecJ78d69j0VrwctnjuGPvM8fQ5iNxQ==</latexit>

Age label[30]

<latexit sha1_base64="M7vmw7VbJIUf0IZQmzGANVHtQBA=">AAAB63icbZDLSsNAFIZP6q3WW9Wlm8EiuCqJCrosunFZwV4gDWUynbRDZyZhZiKU0Fdw40IRt76QO9/GSZqFtv4w8PGfc5hz/jDhTBvX/XYqa+sbm1vV7drO7t7+Qf3wqKvjVBHaITGPVT/EmnImaccww2k/URSLkNNeOL3L670nqjSL5aOZJTQQeCxZxAg2ueVfusGw3nCbbiG0Cl4JDSjVHta/BqOYpIJKQzjW2vfcxAQZVoYRTue1QappgskUj6lvUWJBdZAVu87RmXVGKIqVfdKgwv09kWGh9UyEtlNgM9HLtdz8r+anJroJMiaT1FBJFh9FKUcmRvnhaMQUJYbPLGCimN0VkQlWmBgbT82G4C2fvArdi6Zn+eGq0bot46jCCZzCOXhwDS24hzZ0gMAEnuEV3hzhvDjvzseiteKUM8fwR87nD02+jb0=</latexit><latexit sha1_base64="M7vmw7VbJIUf0IZQmzGANVHtQBA=">AAAB63icbZDLSsNAFIZP6q3WW9Wlm8EiuCqJCrosunFZwV4gDWUynbRDZyZhZiKU0Fdw40IRt76QO9/GSZqFtv4w8PGfc5hz/jDhTBvX/XYqa+sbm1vV7drO7t7+Qf3wqKvjVBHaITGPVT/EmnImaccww2k/URSLkNNeOL3L670nqjSL5aOZJTQQeCxZxAg2ueVfusGw3nCbbiG0Cl4JDSjVHta/BqOYpIJKQzjW2vfcxAQZVoYRTue1QappgskUj6lvUWJBdZAVu87RmXVGKIqVfdKgwv09kWGh9UyEtlNgM9HLtdz8r+anJroJMiaT1FBJFh9FKUcmRvnhaMQUJYbPLGCimN0VkQlWmBgbT82G4C2fvArdi6Zn+eGq0bot46jCCZzCOXhwDS24hzZ0gMAEnuEV3hzhvDjvzseiteKUM8fwR87nD02+jb0=</latexit><latexit sha1_base64="M7vmw7VbJIUf0IZQmzGANVHtQBA=">AAAB63icbZDLSsNAFIZP6q3WW9Wlm8EiuCqJCrosunFZwV4gDWUynbRDZyZhZiKU0Fdw40IRt76QO9/GSZqFtv4w8PGfc5hz/jDhTBvX/XYqa+sbm1vV7drO7t7+Qf3wqKvjVBHaITGPVT/EmnImaccww2k/URSLkNNeOL3L670nqjSL5aOZJTQQeCxZxAg2ueVfusGw3nCbbiG0Cl4JDSjVHta/BqOYpIJKQzjW2vfcxAQZVoYRTue1QappgskUj6lvUWJBdZAVu87RmXVGKIqVfdKgwv09kWGh9UyEtlNgM9HLtdz8r+anJroJMiaT1FBJFh9FKUcmRvnhaMQUJYbPLGCimN0VkQlWmBgbT82G4C2fvArdi6Zn+eGq0bot46jCCZzCOXhwDS24hzZ0gMAEnuEV3hzhvDjvzseiteKUM8fwR87nD02+jb0=</latexit><latexit sha1_base64="M7vmw7VbJIUf0IZQmzGANVHtQBA=">AAAB63icbZDLSsNAFIZP6q3WW9Wlm8EiuCqJCrosunFZwV4gDWUynbRDZyZhZiKU0Fdw40IRt76QO9/GSZqFtv4w8PGfc5hz/jDhTBvX/XYqa+sbm1vV7drO7t7+Qf3wqKvjVBHaITGPVT/EmnImaccww2k/URSLkNNeOL3L670nqjSL5aOZJTQQeCxZxAg2ueVfusGw3nCbbiG0Cl4JDSjVHta/BqOYpIJKQzjW2vfcxAQZVoYRTue1QappgskUj6lvUWJBdZAVu87RmXVGKIqVfdKgwv09kWGh9UyEtlNgM9HLtdz8r+anJroJMiaT1FBJFh9FKUcmRvnhaMQUJYbPLGCimN0VkQlWmBgbT82G4C2fvArdi6Zn+eGq0bot46jCCZzCOXhwDS24hzZ0gMAEnuEV3hzhvDjvzseiteKUM8fwR87nD02+jb0=</latexit>

2666664

11...00

37777752 ZK�1

2


Fig. 1. Illustration of the consistent rank logits CNN (CORAL-CNN) used for age prediction. From the estimated probability values, the binary labels are obtained via Eq. 4 and converted to the age label via Eq. 1.

prove the theorem for rank consistency among the binary classification tasks, which guarantees that the binary tasks produce consistently ranked predictions.

3.2.1. Label Extension and Rank Prediction

Given a training dataset D = {x_i, y_i}_{i=1}^{N}, a rank y_i is first extended into K − 1 binary labels y_i^{(1)}, . . . , y_i^{(K−1)} such that y_i^{(k)} ∈ {0, 1} indicates whether y_i exceeds rank r_k, i.e., y_i^{(k)} = 1{y_i > r_k}. The indicator function 1{·} is 1 if the inner condition is true and 0 otherwise. Providing the extended binary labels as model inputs, we train a single CNN with K − 1 binary classifiers in the output layer (Figure 1).
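For illustration, the label extension step can be written in a few lines of PyTorch. The following is a minimal sketch, not the authors' released implementation; the function name extend_labels and the 0-based integer encoding of the ranks are our own choices.

```python
import torch

def extend_labels(ranks, num_classes):
    """Turn 0-based integer rank indices (values in {0, ..., K-1}) into the
    K-1 binary labels y^(k) = 1{y > r_k} described above.

    Returns a float tensor of shape (N, K-1) with entries in {0, 1}.
    """
    # With 0-based rank indices, y_i > r_k (for k = 1, ..., K-1) is equivalent
    # to rank_index > k - 1, so we compare against the thresholds 0, ..., K-2.
    thresholds = torch.arange(num_classes - 1)
    return (ranks.view(-1, 1) > thresholds.view(1, -1)).float()

# Example with K = 5 ranks: the rank index 3 becomes [1, 1, 1, 0]
print(extend_labels(torch.tensor([0, 3, 4]), num_classes=5))
# tensor([[0., 0., 0., 0.],
#         [1., 1., 1., 0.],
#         [1., 1., 1., 1.]])
```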

Based on the binary task responses, the predicted rank for an input x_i is obtained via

h(x_i) = r_q,   q = 1 + \sum_{k=1}^{K−1} f_k(x_i),   (1)

where f_k(x_i) ∈ {0, 1} is the prediction of the k-th binary classifier in the output layer. We require that {f_k}_{k=1}^{K−1} reflect the ordinal information and are rank-monotonic, f_1(x_i) ≥ f_2(x_i) ≥ . . . ≥ f_{K−1}(x_i), which guarantees consistent predictions. To achieve rank-monotonicity and guarantee binary classifier consistency (Theorem 1), the K − 1 binary tasks share the same weight parameters but have independent bias units (Figure 1).
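As a small sketch of Eq. 1 (assuming the binary task predictions f_k(x) are already available as a 0/1 tensor; the function name is ours):

```python
import torch

def rank_index_from_binary_tasks(f):
    """f: (N, K-1) tensor with the binary task predictions f_k(x) in {0, 1}.
    Returns q from Eq. 1, i.e., q = 1 + sum_k f_k(x), so that h(x) = r_q."""
    return 1 + f.int().sum(dim=1)

# Rank-monotonic predictions [1, 1, 0, 0] for K = 5 yield q = 3, i.e., rank r_3
print(rank_index_from_binary_tasks(torch.tensor([[1, 1, 0, 0]])))  # tensor([3])
```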

3.2.2. Loss Function

Let W denote the weight parameters of the neural network excluding the bias units of the final layer. The penultimate layer, whose output is denoted as g(x_i, W), shares a single weight with all nodes in the final output layer; K − 1 independent bias units are then added to g(x_i, W) such that {g(x_i, W) + b_k}_{k=1}^{K−1} are the inputs to the corresponding binary classifiers in the final layer. Let σ(z) = 1/(1 + exp(−z)) be the logistic sigmoid function. The predicted empirical probability for task k is defined as

P̂(y_i^{(k)} = 1) = σ(g(x_i, W) + b_k).   (2)
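A minimal PyTorch sketch of this weight-sharing output layer follows. The class name and the choice of implementing the shared weight as a single bias-free linear unit are ours; the authors' released code may differ in detail.

```python
import torch

class CoralOutputLayer(torch.nn.Module):
    """Shared-weight output layer: a single bias-free linear unit computes
    g(x, W); K-1 independent bias units b_k are added before the sigmoid
    to produce the task probabilities of Eq. 2."""

    def __init__(self, num_features, num_classes):
        super().__init__()
        self.fc = torch.nn.Linear(num_features, 1, bias=False)          # g(x, W)
        self.biases = torch.nn.Parameter(torch.zeros(num_classes - 1))  # b_1, ..., b_{K-1}

    def forward(self, features):
        g = self.fc(features)          # shape (N, 1)
        logits = g + self.biases       # broadcasts to shape (N, K-1)
        return torch.sigmoid(logits)   # P_hat(y^(k) = 1), Eq. 2

# Example: ResNet-34 penultimate features have 512 dimensions
layer = CoralOutputLayer(num_features=512, num_classes=10)
probas = layer(torch.randn(4, 512))    # shape (4, 9)
```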

For model training, we minimize the loss function

L(W, b) = − \sum_{i=1}^{N} \sum_{k=1}^{K−1} λ^{(k)} [ log(σ(g(x_i, W) + b_k)) y_i^{(k)} + log(1 − σ(g(x_i, W) + b_k)) (1 − y_i^{(k)}) ],   (3)

which is the weighted cross-entropy of the K − 1 binary classifiers. For rank prediction (Eq. 1), the binary labels are obtained via

f_k(x_i) = 1{P̂(y_i^{(k)} = 1) > 0.5}.   (4)

In Eq. 3, λ^{(k)} denotes the weight of the loss associated with the k-th classifier (assuming λ^{(k)} > 0). In the remainder of the paper, we refer to λ^{(k)} as the importance parameter for task k. Some tasks may be less robust or harder to optimize, which can be taken into account by choosing a non-uniform task weighting scheme. For simplicity, we carried out all experiments with uniform task weighting, that is, ∀k : λ^{(k)} = 1. In the next section, we provide the theoretical guarantee for classifier consistency under uniform and non-uniform task importance weighting, given that the task importance weights are positive.
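The loss of Eq. 3 and the thresholding of Eq. 4 could be implemented as follows. This is a sketch with names of our own; a practical implementation would typically operate on logits with torch.nn.functional.binary_cross_entropy_with_logits for numerical stability.

```python
import torch

def coral_loss(probas, levels, task_weights=None):
    """Weighted cross-entropy of the K-1 binary classifiers (Eq. 3).

    probas:       (N, K-1) probabilities P_hat(y^(k) = 1) from Eq. 2
    levels:       (N, K-1) extended binary labels y^(k) in {0, 1}
    task_weights: optional (K-1,) importance parameters lambda^(k) > 0
    """
    eps = 1e-8  # guard against log(0)
    term = (torch.log(probas + eps) * levels
            + torch.log(1.0 - probas + eps) * (1.0 - levels))
    if task_weights is not None:
        term = term * task_weights     # broadcasts over the task dimension
    return -term.sum()

def binary_task_predictions(probas):
    """Eq. 4: f_k(x) = 1{P_hat(y^(k) = 1) > 0.5}."""
    return (probas > 0.5).int()
```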

3.2.3. Theoretical Guarantees for Classifier Consistency

The following theorem shows that by minimizing the loss L (Eq. 3), the learned bias units of the output layer are non-increasing such that b_1 ≥ b_2 ≥ . . . ≥ b_{K−1}. Consequently, the predicted confidence scores or probability estimates of the K − 1 tasks are decreasing, i.e.,

P̂(y_i^{(1)} = 1) ≥ P̂(y_i^{(2)} = 1) ≥ . . . ≥ P̂(y_i^{(K−1)} = 1)   (5)

for all i, ensuring classifier consistency. Consequently, {f_k}_{k=1}^{K−1} (Eq. 4) are also rank-monotonic.

Theorem 1 (Ordered bias units). By minimizing the loss function defined in Eq. 3, the optimal solution (W∗, b∗) satisfies b∗_1 ≥ b∗_2 ≥ . . . ≥ b∗_{K−1}.

Proof. Suppose (W, b) is an optimal solution and b_k < b_{k+1} for some k. Claim: replacing b_k with b_{k+1}, or replacing b_{k+1} with b_k, decreases the objective value L. Let

A_1 = {n : y_n^{(k)} = y_n^{(k+1)} = 1},
A_2 = {n : y_n^{(k)} = y_n^{(k+1)} = 0},
A_3 = {n : y_n^{(k)} = 1, y_n^{(k+1)} = 0}.

By the ordering relationship, we have A_1 ∪ A_2 ∪ A_3 = {1, 2, . . . , N}. Denote p_n(b_k) = σ(g(x_n, W) + b_k) and

δ_n = log(p_n(b_{k+1})) − log(p_n(b_k)),
δ′_n = log(1 − p_n(b_k)) − log(1 − p_n(b_{k+1})).

Since p_n(b_k) is increasing in b_k, we have δ_n > 0 and δ′_n > 0. If we replace b_k with b_{k+1}, the loss terms related to the k-th task are updated. The change of the loss L (Eq. 3) is given as

Δ_1 L = λ^{(k)} [ − \sum_{n∈A_1} δ_n + \sum_{n∈A_2} δ′_n − \sum_{n∈A_3} δ_n ].

Accordingly, if we replace b_{k+1} with b_k, the change of L is given as

Δ_2 L = λ^{(k+1)} [ \sum_{n∈A_1} δ_n − \sum_{n∈A_2} δ′_n − \sum_{n∈A_3} δ′_n ].

By adding (1/λ^{(k)}) Δ_1 L and (1/λ^{(k+1)}) Δ_2 L, we have

(1/λ^{(k)}) Δ_1 L + (1/λ^{(k+1)}) Δ_2 L = − \sum_{n∈A_3} (δ_n + δ′_n) < 0,

and know that either Δ_1 L < 0 or Δ_2 L < 0. Thus, our claim is justified. We conclude that any optimal solution (W∗, b∗) that minimizes L satisfies

b∗_1 ≥ b∗_2 ≥ . . . ≥ b∗_{K−1}.

Note that the theorem for rank-monotonicity proposed by Li and Lin (2007), in contrast to Theorem 1, requires a cost matrix C in which each row, indexed by the training label y_n, is convex. Under this convexity condition, let λ_{y_n}^{(k)} = |C_{y_n, r_k} − C_{y_n, r_{k+1}}| be the weight of the loss associated with the k-th task on the n-th training example, which depends on the label y_n. Li and Lin (2007) proved that by using training example-specific task weights λ_{y_n}^{(k)}, the optimal thresholds are ordered; Niu et al. (2016) noted that example-specific task weights are infeasible in practice. Moreover, this assumption requires that λ_{y_n}^{(k)} ≥ λ_{y_n}^{(k+1)} when r_{k+1} < y_n and λ_{y_n}^{(k)} ≤ λ_{y_n}^{(k+1)} when r_{k+1} > y_n. Theorem 1 is free from this requirement and allows us to choose a fixed weight for each task that does not depend on the individual training examples, which greatly reduces the training complexity. Also, Theorem 1 allows for choosing either a simple uniform task weighting or taking dataset imbalances into account, under the guarantee of non-increasing predicted probabilities and consistent task predictions. Under Theorem 1, the only requirement for guaranteeing rank monotonicity is that the task weights are positive.

4. Experiments

4.1. Datasets and Preprocessing

The MORPH-2 dataset (Ricanek and Tesafaye, 2006), containing 55,608 face images, was downloaded from https://www.faceaginggroup.com/morph/ and preprocessed by locating the average eye position in the respective dataset using facial landmark detection (Sagonas et al., 2016) and then aligning each image in the dataset to the average eye position using the EyepadAlign function in MLxtend v0.14 (Raschka, 2018). The faces were then re-aligned such that the tip of the nose was located in the center of each image. The age labels used in this study were in the range of 16-70 years.

The CACD dataset (Chen et al., 2014) was downloaded from http://bcsiriuschen.github.io/CARC/ and preprocessed similarly to MORPH-2 such that the faces spanned the whole image with the nose tip at the center. The total number of images is 159,449 in the age range of 14-62 years.

The Asian Face Database (AFAD) by Niu et al. (2016) was obtained from https://github.com/afad-dataset/tarball. The AFAD database used in this study contained 165,501 faces in the range of 15-40 years. Since the faces were already centered, no further preprocessing was required.

Following the procedure described in Niu et al. (2016), each image database was randomly divided into 80% training data and 20% test data. All images were resized to 128×128×3 pixels and then randomly cropped to 120×120×3 pixels to augment the model training. During model evaluation, the 128×128×3 RGB face images were center-cropped to a model input size of 120×120×3.
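The resize-and-crop pipeline described above can be sketched with torchvision transforms; this is an illustrative sketch, and the exact preprocessing code is provided in the repository referenced in Section 4.4.

```python
from torchvision import transforms

# Training: resize to 128x128, then randomly crop to 120x120 (data augmentation)
train_transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.RandomCrop((120, 120)),
    transforms.ToTensor(),
])

# Evaluation: resize to 128x128, then deterministically center-crop to 120x120
eval_transform = transforms.Compose([
    transforms.Resize((128, 128)),
    transforms.CenterCrop((120, 120)),
    transforms.ToTensor(),
])
```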

We share the training and test partitions for all datasets, along with all preprocessing code used in this paper, in the code repository (Section 4.4).

4.2. Convolutional Neural Network Architectures

To evaluate the performance of CORAL for age estimation from face images, we chose the ResNet-34 architecture (He et al., 2016), a modern CNN architecture that achieves good performance on a variety of image classification tasks. For the remainder of this paper, we refer to the original ResNet-34 CNN with cross-entropy loss as CE-CNN. To implement a ResNet-34 CNN for ordinal regression using the proposed CORAL method, we replaced the last output layer with the corresponding binary tasks (Figure 1) and refer to this implementation as CORAL-CNN. Similar to CORAL-CNN, we modified the output layer of ResNet-34 to implement the ordinal regression reference approach described in Niu et al. (2016); we refer to this architecture as OR-CNN.
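A sketch of how these architectures can be assembled with torchvision, reusing the CoralOutputLayer class from the Section 3.2.2 sketch (the variable names and the example value of num_classes are ours):

```python
import torch
from torchvision.models import resnet34

num_classes = 55  # e.g., ages 16-70 in MORPH-2 correspond to K = 55 ranks

# CE-CNN baseline: plain ResNet-34 (randomly initialized) with a K-way
# output layer trained with cross-entropy loss
ce_cnn = resnet34()
ce_cnn.fc = torch.nn.Linear(ce_cnn.fc.in_features, num_classes)

# CORAL-CNN: same backbone, but the last layer is replaced with the
# weight-sharing binary-task layer (K-1 sigmoid outputs)
coral_cnn = resnet34()
coral_cnn.fc = CoralOutputLayer(coral_cnn.fc.in_features, num_classes)
```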

4.3. Training and Evaluation

For model evaluation and comparison, we computed the mean absolute error (MAE) and root mean squared error (RMSE) on the test set after the last training epoch:

MAE = (1/N) \sum_{i=1}^{N} |y_i − h(x_i)|,

RMSE = \sqrt{ (1/N) \sum_{i=1}^{N} (y_i − h(x_i))^2 },

where y_i is the ground truth rank of the i-th test example and h(x_i) is the corresponding predicted rank.
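A minimal sketch of these two metrics (function name is ours):

```python
import torch

def mae_rmse(true_ranks, predicted_ranks):
    """Compute MAE and RMSE between integer rank labels and predictions,
    matching the two formulas above."""
    diff = true_ranks.float() - predicted_ranks.float()
    mae = diff.abs().mean()
    rmse = torch.sqrt((diff ** 2).mean())
    return mae.item(), rmse.item()

# Example:
print(mae_rmse(torch.tensor([25, 40, 31]), torch.tensor([27, 38, 31])))
# (1.333..., 1.632...)
```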

The model training was repeated three times with different random seeds (0, 1, and 2) for model weight initialization, while the random seeds were kept consistent between the different methods to allow fair comparisons. All CNNs were trained for 200 epochs with stochastic gradient descent via adaptive moment estimation (Kingma and Ba, 2015) using exponential decay rates β1 = 0.90 and β2 = 0.99 (default settings) and a batch size of 256. To avoid introducing empirical bias by designing our own CNN architecture for comparing the ordinal regression approaches, we adopted a standard architecture (ResNet-34 (He et al., 2016); Section 4.2) for this comparison. Moreover, we chose a uniform task weighting for the cross-entropy of the K − 1 binary classifiers in CORAL-CNN, i.e., we set ∀k : λ^{(k)} = 1 in Eq. 3.

The learning rate was determined by hyperparameter tuning on the validation set. For the various losses (cross-entropy, ordinal regression CNN (Niu et al., 2016), and the proposed CORAL method), we found that a learning rate of α = 5 × 10^{−5} performed best across all models, which is likely due to using the same base architecture (ResNet-34). From the 200 training epochs, the best model was selected via its MAE performance on the validation set. The selected model was then evaluated on the independent test set, from which the reported MAE and RMSE performance values were obtained. For all reported model performances, we report the best test set performance within the 200 training epochs. We provide the complete training logs in the source code repository (Section 4.4).
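For illustration, the optimization setup described above could look as follows. This sketch reuses coral_cnn and num_classes from the Section 4.2 sketch and extend_labels and coral_loss from Section 3.2; the dummy batch stands in for a DataLoader of 120×120 crops with batch size 256.

```python
import torch

# Adam with lr = 5e-5 and the exponential decay rates reported above
optimizer = torch.optim.Adam(coral_cnn.parameters(), lr=5e-5, betas=(0.90, 0.99))

# One illustrative update step on a dummy batch; a real run iterates
# over the training set for 200 epochs and tracks validation MAE.
features = torch.randn(8, 3, 120, 120)
ranks = torch.randint(0, num_classes, (8,))

probas = coral_cnn(features)                                   # Eq. 2
loss = coral_loss(probas, extend_labels(ranks, num_classes))   # Eq. 3
optimizer.zero_grad()
loss.backward()
optimizer.step()
```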

4.4. Hardware and Software

All loss functions and neural network models were implemented in PyTorch 1.5 (Paszke et al., 2019) and trained on NVIDIA GeForce 2080Ti graphics cards. The source code is available at https://github.com/Raschka-research-group/coral-cnn.

Fig. 2. Graphs of the predicted probabilities for each binary classifier task on four different examples from the MORPH-2 test dataset. In all cases, OR-CNN suffers from one or more inconsistencies (indicated by arrows), in contrast to CORAL-CNN.

5. Results and Discussion

We conducted a series of experiments on three independent face image datasets for age estimation (Section 4.1) to compare the proposed CORAL method (CORAL-CNN) with the ordinal regression approach proposed by Niu et al. (2016) (OR-CNN). All implementations were based on the ResNet-34 architecture, as described in Section 4.2. We include the standard ResNet-34 classification network with cross-entropy loss (CE-CNN) as a performance baseline.

5.1. Estimating the Apparent Age from Face Images

Across all ordinal regression datasets (Table 1), we found that both OR-CNN and CORAL-CNN outperform the standard cross-entropy classification loss (CE-CNN), which does not utilize the rank ordering information. Similarly, as summarized in Table 1, the proposed rank-consistent CORAL method shows a substantial performance improvement over OR-CNN (Niu et al., 2016), which does not guarantee classifier consistency.

Moreover, we repeated each experiment three times using different random seeds for model weight initialization and dataset shuffling to ensure that the observed performance improvement of CORAL-CNN over OR-CNN is reproducible and not coincidental. We can conclude that guaranteed classifier consistency via CORAL has a noticeable positive effect on the predictive performance of an ordinal regression CNN (a more detailed analysis of OR-CNN's rank inconsistency is provided in Section 5.2).

For all methods (CE-CNN, CORAL-CNN, and OR-CNN), the overall performance on the different datasets appeared in the following order: MORPH-2 > AFAD > CACD (Table 1). A possible explanation is that MORPH-2 has the best overall image quality, and the photos were taken under relatively consistent lighting conditions and viewing angles. For instance, we found that AFAD includes images with very low resolutions (e.g., 20×20). CACD also contains some lower-quality images. Because CACD has approximately the same size as AFAD, the overall lower performance achieved on this dataset may also be explained by the wider age range that needs to be considered (CACD: 14-62 years, AFAD: 15-40 years).


Table 1. Age prediction errors on the test sets. All models are based on the ResNet-34 architecture.

Method                      Random   MORPH-2                AFAD                   CACD
                            seed     MAE        RMSE        MAE        RMSE        MAE        RMSE

CE-CNN                      0        3.26       4.62        3.58       5.01        5.74       8.20
                            1        3.36       4.77        3.58       5.01        5.68       8.09
                            2        3.39       4.84        3.62       5.06        5.53       7.92
                            AVG±SD   3.34±0.07  4.74±0.11   3.60±0.02  5.03±0.03   5.65±0.11  8.07±0.14

OR-CNN (Niu et al., 2016)   0        2.87       4.08        3.56       4.80        5.36       7.61
                            1        2.81       3.97        3.48       4.68        5.40       7.78
                            2        2.82       3.87        3.50       4.78        5.37       7.70
                            AVG±SD   2.83±0.03  3.97±0.11   3.51±0.04  4.75±0.06   5.38±0.02  7.70±0.09

CORAL-CNN (ours)            0        2.66       3.69        3.42       4.65        5.25       7.41
                            1        2.64       3.64        3.51       4.76        5.25       7.50
                            2        2.62       3.62        3.48       4.73        5.24       7.52
                            AVG±SD   2.64±0.02  3.65±0.04   3.47±0.05  4.71±0.06   5.25±0.01  7.48±0.06

Table 2. Average number of rank inconsistencies on the different test datasets for CORAL-CNN and the Ordinal CNN (OR-CNN) by Niu et al. (2016). The penultimate and last columns list the average number of inconsistencies restricted to the correct and incorrect age predictions of OR-CNN, respectively.

                     CORAL-CNN         OR-CNN (Niu et al., 2016)
                     All predictions   All predictions   Only correct predictions   Only incorrect predictions

MORPH-2   Seed 0     0                 2.28              1.80                       2.37
          Seed 1     0                 2.08              1.70                       2.15
          Seed 2     0                 0.86              0.65                       0.89

AFAD      Seed 0     0                 1.97              1.88                       1.98
          Seed 1     0                 1.91              1.81                       1.92
          Seed 2     0                 1.17              1.02                       1.19

CACD      Seed 0     0                 1.24              0.98                       1.26
          Seed 1     0                 1.68              1.29                       1.71
          Seed 2     0                 0.80              0.63                       0.81


5.2. Empirical Rank Inconsistency Analysis

By design, our proposed CORAL guarantees rank consistency (Theorem 1). In addition, we analyzed the rank inconsistency empirically for both CORAL-CNN and OR-CNN (an example of rank inconsistency is shown in Figure 2). Table 2 summarizes the average numbers of rank inconsistencies for the OR-CNN and CORAL-CNN models on each test dataset. As expected, CORAL-CNN has 0 rank inconsistencies. When comparing the average numbers of rank inconsistencies considering only those cases where OR-CNN predicted the age correctly versus incorrectly, the average number of inconsistencies is higher when OR-CNN makes wrong predictions. This observation can be seen as evidence that rank inconsistency harms predictive performance. Consequently, this finding suggests that addressing rank inconsistency via CORAL is beneficial for the predictive performance of ordinal regression CNNs.
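One plausible way to count such inconsistencies is sketched below: we count the positions k at which the binary prediction for task k+1 exceeds that for task k. The exact counting scheme used for Table 2 is defined in the code repository (Section 4.4) and may differ from this sketch.

```python
import torch

def count_inconsistencies(probas, threshold=0.5):
    """Count, per example, how often the binary task predictions violate
    rank-monotonicity, i.e., positions k with f_k(x) < f_{k+1}(x).

    probas: (N, K-1) predicted probabilities of the K-1 binary tasks.
    Returns a (N,) tensor of inconsistency counts.
    """
    f = (probas > threshold).int()
    return torch.clamp(f[:, 1:] - f[:, :-1], min=0).sum(dim=1)

# A CORAL model yields 0 by construction; an inconsistent output, for contrast:
probas = torch.tensor([[0.9, 0.2, 0.7, 0.1]])   # task 3 "fires" although task 2 did not
print(count_inconsistencies(probas))            # tensor([1])
```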

6. Conclusions

In this paper, we developed the CORAL framework for ordinal regression via extended binary classification with theoretical guarantees for classifier consistency. Moreover, we proved classifier consistency without requiring rank- or training label-dependent weighting schemes, which permits straightforward implementations and efficient model training. CORAL can be readily implemented to extend common CNN architectures for ordinal regression tasks. The experimental results showed that the CORAL framework substantially improved the predictive performance of CNNs for age estimation on three independent age estimation datasets. Our method can be readily generalized to other ordinal regression problems and different types of neural network architectures, including multilayer perceptrons and recurrent neural networks.

7. Acknowledgements

This research was supported by the Office of the Vice Chancellor for Research and Graduate Education at the University of Wisconsin-Madison with funding from the Wisconsin Alumni Research Foundation. Also, we thank the NVIDIA Corporation for a GPU grant to support this study.

References

Baccianella, S., Esuli, A., Sebastiani, F., 2009. Evaluation measures for ordinal regression, in: 2009 Ninth International Conference on Intelligent Systems Design and Applications, IEEE. pp. 283–287.

Cao, D., Lei, Z., Zhang, Z., Feng, J., Li, S.Z., 2012. Human age estimation using ranking SVM, in: Chinese Conference on Biometric Recognition, Springer. pp. 324–331.

Chang, K.Y., Chen, C.S., Hung, Y.P., 2011. Ordinal hyperplanes ranker with cost sensitivities for age estimation, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 585–592.

Chen, B.C., Chen, C.S., Hsu, W.H., 2014. Cross-age reference coding for age-invariant face recognition and retrieval, in: Proceedings of the European Conference on Computer Vision, Springer. pp. 768–783.

Chen, S., Zhang, C., Dong, M., Le, J., Rao, M., 2017. Using Ranking-CNN for age estimation, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 5183–5192.

Chu, W., Keerthi, S.S., 2005. New approaches to support vector ordinal regression, in: Proceedings of the International Conference on Machine Learning, ACM. pp. 145–152.

Crammer, K., Singer, Y., 2002. Pranking with ranking, in: Advances in Neural Information Processing Systems, pp. 641–647.

He, K., Zhang, X., Ren, S., Sun, J., 2016. Deep residual learning for image recognition, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778.

Herbrich, R., Graepel, T., Obermayer, K., 1999. Support vector learning for ordinal regression, in: Proceedings of the IET Conference on Artificial Neural Networks, pp. 97–102.

Kingma, D.P., Ba, J., 2015. Adam: A method for stochastic optimization, in: Bengio, Y., LeCun, Y. (Eds.), 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings, pp. 1–8. URL: http://arxiv.org/abs/1412.6980.

Levi, G., Hassner, T., 2015. Age and gender classification using convolutional neural networks, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 34–42.

Li, C., Liu, Q., Liu, J., Lu, H., 2012. Learning ordinal discriminative features for age estimation, in: 2012 IEEE Conference on Computer Vision and Pattern Recognition, IEEE. pp. 2570–2577.

Li, L., Lin, H.T., 2007. Ordinal regression by extended binary classification, in: Advances in Neural Information Processing Systems, pp. 865–872.

McCullagh, P., 1980. Regression models for ordinal data. Journal of the Royal Statistical Society. Series B (Methodological), 109–142.

Niu, Z., Zhou, M., Wang, L., Gao, X., Hua, G., 2016. Ordinal regression with multiple output CNN for age estimation, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4920–4928.

Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L., Desmaison, A., Kopf, A., Yang, E., DeVito, Z., Raison, M., Tejani, A., Chilamkurthy, S., Steiner, B., Fang, L., Bai, J., Chintala, S., 2019. PyTorch: An imperative style, high-performance deep learning library, in: Advances in Neural Information Processing Systems 32, Curran Associates, Inc. pp. 8024–8035.

Polania, L., Fung, G., Wang, D., 2019. Ordinal regression using noisy pairwise comparisons for body mass index range estimation, in: 2019 IEEE Winter Conference on Applications of Computer Vision (WACV), IEEE. pp. 782–790.

Rajaram, S., Garg, A., Zhou, X.S., Huang, T.S., 2003. Classification approach towards ranking and sorting problems, in: Proceedings of the European Conference on Machine Learning, Springer. pp. 301–312.

Ramanathan, N., Chellappa, R., Biswas, S., 2009. Age progression in human faces: A survey. Journal of Visual Languages and Computing 15, 3349–3361.

Ranjan, R., Sankaranarayanan, S., Castillo, C.D., Chellappa, R., 2017. An all-in-one convolutional neural network for face analysis, in: Proceedings of the IEEE Conference on Automatic Face & Gesture Recognition, pp. 17–24.

Raschka, S., 2018. MLxtend: Providing machine learning and data science utilities and extensions to Python's scientific computing stack. The Journal of Open Source Software 3, 1–2.

Raschka, S., Mirjalili, V., 2019. Python Machine Learning, 3rd Ed. Packt Publishing, Birmingham, UK.

Ricanek, K., Tesafaye, T., 2006. MORPH: A longitudinal image database of normal adult age-progression, in: Proceedings of the IEEE Conference on Automatic Face and Gesture Recognition, pp. 341–345.

Rothe, R., Timofte, R., Van Gool, L., 2015. DEX: Deep expectation of apparent age from a single image, in: Proceedings of the IEEE Conference on Computer Vision Workshops, pp. 10–15.

Sagonas, C., Antonakos, E., Tzimiropoulos, G., Zafeiriou, S., Pantic, M., 2016. 300 faces in-the-wild challenge: database and results. Image and Vision Computing 47, 3–18.

Shashua, A., Levin, A., 2003. Ranking with large margin principle: Two approaches, in: Advances in Neural Information Processing Systems, pp. 961–968.

Shen, L., Joshi, A.K., 2005. Ranking and reranking with perceptron. Machine Learning 60, 73–96.

Yang, P., Zhong, L., Metaxas, D., 2010. Ranking model for facial age estimation, in: 2010 International Conference on Pattern Recognition, IEEE. pp. 3404–3407.


8. Supplementary Material

8.1. Generalization Bounds

Based on well-known generalization bounds for binary classification, we can derive new generalization bounds for our ordinal regression approach that apply to a wide range of practical scenarios, as we only require C_{y,r_k} = 0 if r_k = y and C_{y,r_k} > 0 if r_k ≠ y. Moreover, Theorem 2 shows that if each binary classification task in our model generalizes well in terms of the standard 0/1 loss, the final rank prediction via h (Eq. 1) also generalizes well.

Theorem 2 (reduction of generalization error). Suppose C is the cost matrix of the original ordinal label prediction problem, with C_{y,y} = 0 and C_{y,r_k} > 0 for r_k ≠ y. Let P be the underlying distribution of (x, y), i.e., (x, y) ∼ P. If the binary classification rules {f_k}_{k=1}^{K−1} obtained by optimizing Eq. 3 are rank-monotonic, then

E_{(x,y)∼P} C_{y,h(x)} ≤ \sum_{k=1}^{K−1} E_{(x,y)∼P} [ |C_{y,r_k} − C_{y,r_{k+1}}| · 1{f_k(x) ≠ y^{(k)}} ].   (6)

Proof. For any x ∈ X, we have

f_1(x) ≥ f_2(x) ≥ . . . ≥ f_{K−1}(x).

If h(x) = y, then C_{y,h(x)} = 0.

If h(x) = r_q ≺ y = r_s, then q < s. We have

f_1(x) = f_2(x) = . . . = f_{q−1}(x) = 1

and

f_q(x) = f_{q+1}(x) = . . . = f_{K−1}(x) = 0.

Also,

y^{(1)} = y^{(2)} = . . . = y^{(s−1)} = 1

and

y^{(s)} = y^{(s+1)} = . . . = y^{(K−1)} = 0.

Thus, 1{f_k(x) ≠ y^{(k)}} = 1 if and only if q ≤ k ≤ s − 1. Since C_{y,y} = 0,

C_{y,h(x)} = \sum_{k=q}^{s−1} (C_{y,r_k} − C_{y,r_{k+1}}) · 1{f_k(x) ≠ y^{(k)}}
           ≤ \sum_{k=q}^{s−1} |C_{y,r_k} − C_{y,r_{k+1}}| · 1{f_k(x) ≠ y^{(k)}}
           ≤ \sum_{k=1}^{K−1} |C_{y,r_k} − C_{y,r_{k+1}}| · 1{f_k(x) ≠ y^{(k)}}.

Similarly, if h(x) = r_q ≻ y = r_s, then q > s and

C_{y,h(x)} = \sum_{k=s}^{q−1} (C_{y,r_{k+1}} − C_{y,r_k}) · 1{f_k(x) ≠ y^{(k)}}
           ≤ \sum_{k=1}^{K−1} |C_{y,r_{k+1}} − C_{y,r_k}| · 1{f_k(x) ≠ y^{(k)}}.

In any case, we have

C_{y,h(x)} ≤ \sum_{k=1}^{K−1} |C_{y,r_k} − C_{y,r_{k+1}}| · 1{f_k(x) ≠ y^{(k)}}.

By taking the expectation on both sides with (x, y) ∼ P, we arrive at Eq. (6).
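As a small numeric illustration of the per-example inequality used in this proof, consider an absolute-error cost matrix C_{y,r_k} = |y − r_k|, which satisfies the assumptions of Theorem 2. The example (values and variable names) is our own and is not taken from the paper's experiments.

```python
import numpy as np

K = 5
ranks = np.arange(1, K + 1)                    # rank values r_1, ..., r_5 = 1, ..., 5
C = np.abs(ranks[:, None] - ranks[None, :])    # C[y-1, k-1] = |y - r_k|, so C[y-1, y-1] = 0

y = 4                                          # true rank r_4
f = np.array([1, 1, 0, 0])                     # rank-monotonic binary predictions f_k(x)
q = 1 + f.sum()                                # Eq. 1: predicted rank r_3
y_ext = (y > ranks[:-1]).astype(int)           # extended labels y^(k) = 1{y > r_k}

lhs = C[y - 1, q - 1]                                               # C_{y, h(x)} = 1
rhs = np.sum(np.abs(C[y - 1, :-1] - C[y - 1, 1:]) * (f != y_ext))   # right-hand side = 1
print(lhs, rhs, lhs <= rhs)                    # 1 1 True
```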

In Li and Lin (2007), by assuming the cost matrix to have V-shaped rows, the researchers derive generalization bounds by constructing a discrete distribution on {1, 2, . . . , K − 1} conditional on each y, given that the binary classifications are rank-monotonic or every row of C is convex. However, the only case they provided for the existence of rank-monotonic binary classifiers was the ordered threshold model, which requires a cost matrix with convex rows and example-specific task weights. In other words, when the cost matrix is only V-shaped but does not meet the convex-row condition, i.e., C_{y,r_k} − C_{y,r_{k−1}} > C_{y,r_{k+1}} − C_{y,r_k} > 0 for some r_k > y, the method proposed in Li and Lin (2007) does not provide a practical way to bound the generalization error. In contrast, our result does not rely on cost matrices with V-shaped or convex rows and can be applied to a broader variety of real-world use cases.