Living cells can measure chemical concentrations with remarkable accuracy, even though these measurements are inherently noisy due to the stochastic binding of ligand to the receptor. A widely used mechanism for reducing the sensing error is to increase the effective number of measurements via time integration at the receptor. This mechanism is implemented by the signaling network downstream of the receptor, yet how it can be implemented optimally given constraints on cellular resources, such as protein copies and time, remains unknown. To address this question, we employ our sampling framework [Govern and ten Wolde, Proc. Natl. Acad. Sci. USA 111, 17486 (2014); doi:10.1073/pnas.1411524111] and extend it here to time-varying ligand concentrations. The framework starts from the observation that the signaling network implements time integration by discretely sampling the ligand-binding state of the receptor and storing these samples in the chemical modification states of the readout molecules downstream. It reveals that the sensing error has two distinct contributions: a sampling error, which is determined by the number of samples, their independence, and their accuracy, and a dynamical error, which depends on the timescale on which these samples are generated. We test our previously identified design principle, which states that in an optimally designed system the number of receptors and their integration time, which together determine the number of independent concentration measurements at the receptor level, equals the number of readout proteins that store these measurements. We show that this principle is robust to the dynamics of the input and to the relative costs of the receptor and readout proteins: these resources are fundamental and cannot compensate for each other.
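The design principle tested here can be summarized in a compact form; the symbols below are our own shorthand (not defined in the abstract itself), and the relation is a sketch of the resource-matching condition rather than an exact equality:

```latex
% Resource-matching design principle (our paraphrase, our notation):
% N_R    : number of receptor copies
% T      : receptor integration time
% \tau_c : correlation time of receptor-ligand binding
% N_X    : number of readout-protein copies storing the samples
\[
  \underbrace{\frac{N_R \, T}{\tau_c}}_{\substack{\text{independent measurements}\\ \text{at the receptor level}}}
  \;\simeq\;
  \underbrace{N_X}_{\substack{\text{samples stored}\\ \text{in readout proteins}}}
\]
```

In this reading, increasing $N_R T/\tau_c$ beyond $N_X$ (or vice versa) wastes resources, which is why receptors and readout proteins cannot compensate for each other.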