Programs aiming to develop large-scale digital infrastructure for the humanities motivate this development mostly by the wish to foster methodological innovation through digital and computational approaches. It is questionable, however, whether large-scale infrastructures are the right incubator model for bringing about such innovation. The generalizations, standardizations, and management and development processes that large infrastructures must apply to cater to the humanities at large are at odds with well-known aspects of innovation. Moreover, such generalizations close off many possibilities for exploring new modeling and computing approaches. I argue that methodological innovation, and the advancement of modeling humanities data and heuristics, is better served by flexible, small-scale, research-focused development practices. I also show that modeling highly specific distributed web services is a more promising avenue toward the sustainability of highly heterogeneous humanities digital data than standards enforcement and current encoding practices.