relu    Apply rectified linear unit activation

Y = relu(X) computes the ReLU activation of the input data X. X can be a labeled or an unlabeled dlarray. If X is a labeled dlarray, the output Y is a labeled dlarray with the same dimension labels as X. If X is an unlabeled dlarray, Y is an unlabeled dlarray.

The relu operation performs a threshold operation: any input value less than zero is set to zero. This is equivalent to:

Y = X;   % If X > 0
Y = 0;   % If X <= 0
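As an illustrative sketch only (not the dlarray API), the same threshold operation can be expressed in Python with NumPy, where `np.maximum(X, 0)` applies the element-wise rule above:

```python
import numpy as np

def relu(x):
    # Threshold operation: any input value less than zero is set to zero,
    # i.e. y = x if x > 0, and y = 0 otherwise.
    return np.maximum(x, 0)

# Example: negative entries are clamped to zero, non-negative entries pass through.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
y = relu(x)
print(y)
```

Running this prints an array in which the first three entries are 0 and the last two are unchanged (1.5 and 3.0), matching the piecewise definition.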