This work concerns the Proper Generalized Decomposition (PGD), which is used to solve possibly nonlinear problems defined over a time-space domain. The PGD is an a priori model reduction technique that drastically decreases CPU costs by seeking the solution of a problem in a reduced-order basis generated automatically by a dedicated algorithm. The algorithm used herein is the LATIN method, a non-incremental iterative strategy that generates approximations of the solution over the entire time-space domain through successive enrichments. The issue addressed in this paper is the construction of the resulting reduced-order models over the iterations, which can account for a large part of the remaining global CPU cost. To ease the construction of these models, we propose an algebraic framework adapted to the PGD that defines a “compressed” version of the data. The space of compressed fields exhibits very interesting properties that lead to a significant increase in the performance of the global strategy.