Tensor renormalization group (TRG) constitutes an important methodology for accurate simulations of strongly correlated lattice models. Facilitated by automatic differentiation, a technique widely used in deep learning, we propose a unified framework of differentiable TRG ($\partial$TRG) that can be applied to automatically improve various TRG methods. $\partial$TRG systematically extends the essential concept of second renormalization [Phys. Rev. Lett. 103, 160601 (2009)], in which the tensor environment is computed recursively in a backward iteration. Given the forward TRG process, $\partial$TRG automatically finds the gradients of the local tensors through backpropagation, with which one can deeply ``train'' the tensor networks. We benchmark $\partial$TRG on the square-lattice Ising model and demonstrate its power by simulating one- and two-dimensional quantum systems at finite temperature. Global optimization, together with GPU acceleration, renders $\partial$TRG a highly efficient and accurate many-body computation approach.
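To make the forward-plus-backward structure concrete, the following is a minimal sketch of a differentiable TRG in JAX; the framework choice, function names, bond dimension chi, and iteration count niter are illustrative assumptions of this sketch, not the authors' implementation. It coarse-grains the square-lattice Ising partition function with Levin-Nave TRG and obtains the energy per site by backpropagating through the entire contraction, the gradient playing the role of the recursively computed environment.

# Minimal sketch (not the authors' code): Levin-Nave TRG for the 2D Ising
# model with the gradient of ln Z obtained by backpropagation through the
# whole coarse-graining, i.e. the backward iteration of second renormalization.
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # double precision for stable SVDs

def ising_tensor(beta):
    # Bond matrix M[s,s'] = exp(beta*s*s') factorized as M = W @ W.T, giving
    # the local tensor T[u,l,d,r] = sum_s W[s,u] W[s,l] W[s,d] W[s,r].
    lam = jnp.stack([2.0 * jnp.cosh(beta), 2.0 * jnp.sinh(beta)])
    Q = jnp.array([[1.0, 1.0], [1.0, -1.0]]) / jnp.sqrt(2.0)
    W = Q * jnp.sqrt(lam)
    return jnp.einsum('su,sl,sd,sr->uldr', W, W, W, W)

def trg_step(T, chi):
    # One Levin-Nave step: split T along its two diagonals by truncated SVD,
    # then contract four "triangles" around a plaquette into the coarse tensor.
    # Caution: the textbook SVD adjoint is singular for exactly degenerate
    # singular values; practical implementations regularize the backward pass.
    D = T.shape[0]
    # split (u,l)|(d,r)
    U, s, Vh = jnp.linalg.svd(T.reshape(D * D, D * D), full_matrices=False)
    k = min(chi, s.shape[0])
    S1 = (U[:, :k] * jnp.sqrt(s[:k])).reshape(D, D, k)           # S1[u,l,a]
    S3 = (jnp.sqrt(s[:k])[:, None] * Vh[:k]).reshape(k, D, D)    # S3[a,d,r]
    # split (l,d)|(r,u)
    U, s, Vh = jnp.linalg.svd(
        jnp.transpose(T, (1, 2, 3, 0)).reshape(D * D, D * D), full_matrices=False)
    S2 = (U[:, :k] * jnp.sqrt(s[:k])).reshape(D, D, k)           # S2[l,d,b]
    S4 = (jnp.sqrt(s[:k])[:, None] * Vh[:k]).reshape(k, D, D)    # S4[b,r,u]
    return jnp.einsum('jku,lij,dmi,kmr->uldr', S2, S3, S4, S1)

def lnZ_per_site(beta, chi=24, niter=20):
    # Coarse-grain a 2^niter-site torus down to a single tensor, rescaling T
    # at each step and accumulating the rescaling factors into ln Z.
    T = ising_tensor(beta)
    lnz, nsites = 0.0, 2.0 ** niter
    for _ in range(niter):
        norm = jnp.max(jnp.abs(T))
        lnz += nsites * jnp.log(norm)
        T = trg_step(T / norm, chi)
        nsites /= 2.0
    lnz += jnp.log(jnp.einsum('ulul->', T))  # trace out the final tensor
    return lnz / 2.0 ** niter

# Differentiating through the full forward contraction yields the energy per
# site, u = -d(ln Z)/d(beta), from a single backward pass.
energy = -jax.grad(lnZ_per_site)(0.4406868)  # near the critical coupling

The result can be checked against Onsager's exact solution; at the critical point the energy per site is $-\sqrt{2}$. The same pattern extends beyond observables: because the gradient of any scalar loss with respect to the local tensors is available, the tensors themselves can be optimized, which is the sense in which $\partial$TRG ``trains'' the network.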