In theoretical particle physics, where the idea originated, renormalization was a mathematical recipe to get rid of the cutoff needed to keep certain quantities finite. The physical meaning of the process emerges in condensed-matter physics, where the cutoff comes from the finite spacing of atomic lattices and is a physical quantity. It happens that one can "bury" the cutoff by re-expressing it in terms of observable quantities at a macroscopic length scale. This happens when the theory is close to a "fixed point" in the space of possible theories. The theory is then said to be renormalizable. The process of renormalization corresponds to increasing the effective length scale through coarse-graining. A minimal sketch of such a coarse-graining step, in the spirit of a block-spin transformation, is given below.
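The following toy illustration (not from the text) shows coarse-graining in the block-spin sense: each block of lattice sites is replaced by a single effective site, so the effective lattice spacing grows by the block size at every step. The function name block_spin and the block size b are choices made here for illustration only.

import numpy as np

def block_spin(spins, b=3):
    """Majority-rule block-spin transformation: coarse-grain a 2D Ising
    configuration by replacing each b x b block with the sign of its sum,
    so the effective lattice spacing (the length scale) grows by a factor b."""
    L = spins.shape[0] - spins.shape[0] % b        # trim to a multiple of b
    blocks = spins[:L, :L].reshape(L // b, b, L // b, b)
    coarse = np.sign(blocks.sum(axis=(1, 3)))
    coarse[coarse == 0] = 1                        # break ties (possible only for even b)
    return coarse

# Example: repeatedly coarse-grain a random +/-1 configuration.
rng = np.random.default_rng(0)
config = rng.choice([-1, 1], size=(81, 81))
for step in range(3):
    config = block_spin(config, b=3)
    print(f"step {step + 1}: lattice {config.shape}, mean spin {config.mean():+.3f}")

Each iteration enlarges the effective length scale by a factor of three; following how the effective couplings change under such steps is what traces out a trajectory in the space of possible theories.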
For further discussion, see K. Huang, Quantum Field Theory, Chaps. 16, 17. An explicit illustration may be found in K. Halpern and K. Huang, "Nontrivial directions for scalar fields," Phys. Rev. D 53, 3252 (1996).
A theory may be "crossing over" from one fixed point to another as the length scale changes. In this case the cutoff cannot be buried, and the theory is not renormalizable. The physical manifestation is that a microscopic length scale remains relevant even in the macroscopic domain. In purely macroscopic terms, there appears to be "intrinsic" randomness in the system. Such is the case in the propagation of a crack: far from the tip of the advancing crack the system may be described by a continuum model, but near the tip one must take the atomic structure into account.
An approach to this problem is to introduce explicitly a distance-dependent length scale, by varying the amount of coarse-graining according to the distance from the tip of the crack. The actual implementation of this idea remains to be worked out.
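Since the actual implementation remains open, the following is only a schematic toy of the geometric idea, under assumptions made here: a one-dimensional field sampled at microscopic spacing dx is averaged over blocks whose size grows with distance from the tip, so atomic resolution is kept near the tip and coarse resolution far from it. The function name adaptive_coarse_grain and the parameters a and growth are hypothetical.

import numpy as np

def adaptive_coarse_grain(field, dx, tip=0.0, a=1.0, growth=0.5):
    """Toy position-dependent coarse-graining of a 1D field sampled at
    spacing dx: near the crack tip (x = tip) the block size stays at the
    microscopic scale a; far from it the block size grows linearly with
    distance, so fine detail is retained only where it matters."""
    x = np.arange(len(field)) * dx
    out_x, out_f, i = [], [], 0
    while i < len(field):
        # local coarse-graining length: microscopic at the tip, growing with distance
        ell = a + growth * abs(x[i] - tip)
        n = max(1, int(round(ell / dx)))           # number of fine cells per block
        j = min(i + n, len(field))
        out_x.append(x[i:j].mean())
        out_f.append(field[i:j].mean())
        i = j
    return np.array(out_x), np.array(out_f)

# Example: a field varying rapidly near the tip at x = 0.
x = np.arange(0, 100, 0.1)
field = np.exp(-x) * np.sin(20 * x) + 0.1 * x
xs, fs = adaptive_coarse_grain(field, dx=0.1, tip=0.0, a=0.1, growth=0.2)
print(f"{len(field)} fine cells -> {len(fs)} blocks")

This sketch only varies the averaging window with distance; how such a scheme would couple to the dynamics of the advancing crack is precisely what remains to be worked out.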