Update: Daniel follows through
Rather than wait for the next installment of RoughlyDrafted, I'm going to go ahead and tackle a point Daniel suggests he'll be addressing: the claim that Apple is simply out of ideas.
I'll be making one crucial assumption: if I can think up something useful, someone at Apple has probably already thought of it.
Apple has put a lot of effort into its implementation of OpenGL, hardware acceleration included, but OpenGL is all about surfaces. It won't help you with the flexing of hair or skin, for instance, much less with the fluid dynamics of smoke rising through air or water flowing over rocks in a stream. Of course, dealing with solids, fluids, and factors like gravity and momentum is hugely complicated, and there's a wide selection of expensive software used by professionals in gaming, animation, architecture, and mechanical design. Apple certainly couldn't replace all of that, but by laying a little groundwork it could simplify the creation of such software and help make hobbyist-friendly versions possible. For lack of a better name, let's refer to it as Core Physics.
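To make that concrete, here's a toy sketch of the kind of computation such a framework would standardize: a single particle falling under gravity, stepped with explicit Euler integration. Everything here (the CPParticle type, the CPStepParticle function) is invented for illustration; none of it is a real Apple API.

```
// Toy sketch of what a hypothetical "Core Physics" might standardize.
// All names are invented for illustration; this is not a real Apple API.
#import <Foundation/Foundation.h>

typedef struct {
    double x, y;   // position (m)
    double vx, vy; // velocity (m/s)
} CPParticle;

// Advance one particle under gravity by dt seconds (explicit Euler).
static void CPStepParticle(CPParticle *p, double dt) {
    const double g = -9.81; // gravitational acceleration (m/s^2)
    p->vy += g * dt;
    p->x  += p->vx * dt;
    p->y  += p->vy * dt;
}

int main(void) {
    @autoreleasepool {
        // A ball dropped from 10 m, drifting to the right at 2 m/s.
        CPParticle ball = { 0.0, 10.0, 2.0, 0.0 };
        for (int i = 0; i < 5; i++) {
            CPStepParticle(&ball, 0.1);
            NSLog(@"t=%.1fs  x=%.2f  y=%.2f", (i + 1) * 0.1, ball.x, ball.y);
        }
    }
    return 0;
}
```

A real framework would of course handle constraints, collisions, and fluids, but even this much shows how little scaffolding a hobbyist would need if the integration loop were system plumbing rather than homework.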
Core Data has already done a lot to simplify the handling and preservation of data. But giving meaning to that data is still left to the developer. One way the system could support this task is by making a wider range of statistical operations available as methods on basic classes, such as NSNumber and NSArray. Even better would be a library that starts with statistics and goes on to support building object networks based on patterns in the data, putting data in context. Let's call this Core Heuristics.
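You can fake a little of this today with a category. The sketch below adds hypothetical ch_mean and ch_standardDeviation methods to NSArray; the names, and the idea of Apple shipping them system-wide, are pure speculation on my part.

```
// Sketch of the statistical conveniences imagined above, as a category
// on NSArray. Method names are hypothetical, not a real Apple API.
#import <Foundation/Foundation.h>
#include <math.h>

@interface NSArray (CHStatistics)
- (double)ch_mean;              // arithmetic mean of NSNumber elements
- (double)ch_standardDeviation; // population standard deviation
@end

@implementation NSArray (CHStatistics)
- (double)ch_mean {
    if (self.count == 0) return 0.0;
    double sum = 0.0;
    for (NSNumber *n in self) sum += n.doubleValue;
    return sum / self.count;
}
- (double)ch_standardDeviation {
    if (self.count == 0) return 0.0;
    double mean = [self ch_mean], sumSq = 0.0;
    for (NSNumber *n in self) {
        double d = n.doubleValue - mean;
        sumSq += d * d;
    }
    return sqrt(sumSq / self.count);
}
@end

int main(void) {
    @autoreleasepool {
        NSArray *samples = @[@2.0, @4.0, @4.0, @4.0, @5.0, @5.0, @7.0, @9.0];
        NSLog(@"mean = %.2f, stddev = %.2f",
              [samples ch_mean], [samples ch_standardDeviation]);
        // Prints: mean = 5.00, stddev = 2.00
    }
    return 0;
}
```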
Picking meaningful information out of data streams, such as those provided by a microphone or a camera, is both difficult and very useful if you can manage it. Support for doing so might be called Core Hearing and Core Vision, respectively.
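As a taste of how low the entry point is, here's about the simplest "hearing" primitive imaginable: flag an audio buffer whose RMS level rises above a threshold. A real Core Hearing would go far beyond this, but most sound and speech detectors start with something like it. The CHBufferIsActive function is my own invention, not any existing API.

```
// The simplest possible "hearing" primitive: is there sound here at all?
// CHBufferIsActive is a made-up name; nothing here is a real Apple API.
#import <Foundation/Foundation.h>
#include <math.h>

// Returns YES if the buffer's RMS level exceeds the given threshold.
static BOOL CHBufferIsActive(const float *samples, size_t count, float threshold) {
    if (count == 0) return NO;
    double sumSq = 0.0;
    for (size_t i = 0; i < count; i++)
        sumSq += (double)samples[i] * samples[i];
    return sqrt(sumSq / count) > threshold;
}

int main(void) {
    // Fake one frame of silence and one frame of a loud 440 Hz tone.
    float quiet[512] = { 0 };
    float loud[512];
    for (size_t i = 0; i < 512; i++)
        loud[i] = 0.8f * sinf(2.0f * M_PI * 440.0f * i / 44100.0f);
    NSLog(@"quiet active? %d", CHBufferIsActive(quiet, 512, 0.05f));
    NSLog(@"loud  active? %d", CHBufferIsActive(loud, 512, 0.05f));
    return 0;
}
```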
Given the above, the idea of providing support for environmental/machine control practically suggests itself. For Macs, this would likely be mostly about interfacing with the several home-automation standards in use, and maybe about developer tools for other types of hardware, but the possibilities don't end there. Response to touch is a big deal in the physical world, and having touch awareness built into UIKit positions OS X advantageously for use in responsive toys, for example. Granted, there's a lot of work to be done to get from an iPhone to an 'iBear', but I wouldn't be too surprised to learn that Apple was already on it. Again, for lack of a better name, let's call this Core Automation.
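One way such a framework might paper over the differences between standards like X10 and INSTEON is a common protocol for switchable devices, with a per-standard class behind each one. The sketch below is entirely hypothetical; CASwitchableDevice and CADemoLamp are names I made up.

```
// Hypothetical "Core Automation" shape: one protocol for switchable
// devices, with standard-specific classes behind it. All names invented.
#import <Foundation/Foundation.h>

@protocol CASwitchableDevice <NSObject>
- (NSString *)deviceName;
- (void)turnOn;
- (void)turnOff;
@end

// A stand-in for a class that would speak a real wire protocol
// (X10, INSTEON, etc.) behind the same interface.
@interface CADemoLamp : NSObject <CASwitchableDevice>
@end

@implementation CADemoLamp
- (NSString *)deviceName { return @"Living room lamp"; }
- (void)turnOn  { NSLog(@"%@ -> on",  [self deviceName]); }
- (void)turnOff { NSLog(@"%@ -> off", [self deviceName]); }
@end

int main(void) {
    @autoreleasepool {
        // Client code sees only the protocol, never the wire format.
        id<CASwitchableDevice> lamp = [[CADemoLamp alloc] init];
        [lamp turnOn];
        [lamp turnOff];
    }
    return 0;
}
```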
Note that the above is all about plumbing. I haven't yet suggested anything that would be apparent to the average Mac user, and that's where I'm going to leave it for now. Maybe Daniel has some ideas along those lines.
Wednesday, June 25, 2008