In an article titled "Apple Thunderbolt Display teardown: So many chips it's hard to believe there's no computer inside", 9to5Mac passes along this observation:
iFixit says that both sides of the logic board are packed with so many chips “that it’s hard to believe there’s no computer inside”.
Arguably, we've already reached the point of saturation where incremental gains in computational power no longer produce noticeable improvements in the user experience on displays the size of the Thunderbolt Display, even with high-end CPUs and GPUs unconstrained by power dissipation. Given that, and given the inexorable migration of such high-end performance to low-power, low-cost, highly integrated chips of the sort found in the iPad, how long will it be before it simply makes no sense to build something as complex as the Thunderbolt Display without also making it a computer in its own right? Five years? Ten? (The same logic could be applied to TV tuner hardware.)