"State" here refers to a snapshot view of what's going on in a complex dynamic system.
A branching state is one that can go two or more different ways, which is to say one for which subsequent states are not just unpredictable but intrinsically indeterminate. There is a trivial sense in which this is always true, due to the impossibility of knowing precisely what's happening in any real complex system (as opposed to a computer simulation), but that's not what I'm talking about here.
A branching state is one that falls within the range of precursory conditions for more than one distinct outcome. In practical terms, the art of creating and maintaining options is one of intentionally creating a branching state and delaying its resolution into commitment to a specific outcome. This is sometimes referred to as hedging one's bets, and accomplishing it without duplicate expenditure is the essence of good business management.
But branching states don't belong only to business; they're found in particle physics, music, and even the martial arts (techniques that originate in or pass through a common state which serves as an opportunity to switch between them). It's one of those general principles that form the vocabulary of systems.
Now for the point. I have a hunch humanity as a whole is either in or fast approaching a branching state, one that could go any of several very different ways. This is both scary and cause for hope. Change is a given, of course; it's the character of that change that remains in doubt.
Perhaps this would be a good time to take a lesson from business management, cultivating patience for the ambiguity of the situation, in the faith that the choices before us will become clearer with time, doing what seems best in the meantime.
Thursday, December 23, 2010
it's not about Steve: finger/moon
Steve Jobs isn't so proud that he's above taking in some good feelings from all the adulation he's received in recent years, and being named "person of the year" by the Financial Times has got to feel good, but I imagine he feels a little guilty about it, not because he doesn't deserve it, but because it's another example of missing the point.
Steve Jobs isn't about Steve Jobs; he's about all of the cool stuff he gets to help bring into existence, with special emphasis on what hasn't yet been publicly revealed; he's about assembling a great team, excoriating them when they get sidetracked or confused, and giving them room to run when they're on the track of something worthy of the name Apple; he's about building Apple into a persistent force for thinking different.
To focus on the man in preference to the substance of his vision is akin to focusing on the finger pointing at the moon rather than following its lead to the moon itself.
Tuesday, December 21, 2010
journalistic justice, the hard-won scoop
Architosh, which claims, with good reason, to be "the leading Internet magazine dedicated to Mac CAD and 3D professionals and students worldwide," and which is the third party most responsible for keeping the idea of AutoCAD on the Mac alive, has published an extensive interview (page 1 of 6) with Rob Maguire, the AutoCAD for Mac Product Manager who headed the team that implemented AutoCAD on Mac OS X.
Sunday, December 19, 2010
where to from Avatar?
If you already know what James Cameron has up his sleeve for the sequel(s) of Avatar, you might want to skip this post, because it's not about where he'll actually, eventually decide to go with the story, but about the constraints and choices he faces in making those decisions.
To begin with, you have a planet occupied by the Na'vi, who've had their moment of unmistakable first contact with an alien race and won't be able to return to the innocence that preceded that moment. Moreover, as the first film ends, there's a small contingent of those aliens still on the planet, and most likely a few interstellar ships on the way at near lightspeed, with nowhere else to go but back to Earth, something they might not be able to do immediately or without resources from the planet. The Na'vi can cling to their nature-based ways and their communion with Eywa, but they cannot forget that they are not alone in the cosmos, a fact of which they must surely be reminded each time their sun sets to reveal a sky half-lit by the gas giant around which their moon orbits, and by the other moons that share it.
More importantly, the victory they've won is temporary. If RDA decided to take retribution, they could do it from the safety of space, by simply throwing rocks, which would arrive as meteorites, at Na'vi settlements and other strategic locations, beginning with RDA's own former base, to prevent news of the attack from getting back to Earth. To really defend themselves, the Na'vi would need a space fleet capable of intercepting and destroying incoming rocks or missiles before they reached their targets, and quickly. Perhaps Eywa could help with such a mobilization, particularly with accomplishing it without the Na'vi sacrificing their essential selves, embedded as they are in the biology of Pandora. Perhaps the tendrils with which they accomplish tsaheylu might be employed as a means of rapid instruction in science and technology. Perhaps a small percentage of the Na'vi might show an aptitude for such learning that would qualify them as geniuses on Earth, rapidly progressing beyond what they'd been taught to break new theoretical ground.
For Jake and Neytiri, there's the question of how much of Jake survives in the body to which his mind and soul transferred with Eywa's help, and whether he shares that body with echoes of his dead twin brother, the scientist, for whom the body was created and who presumably spent several hundred hours driving it before his untimely death, well before Jake did. There's also the question of whether Jake can rise to the moment, when doing so means making use of his celebrity to lead the Na'vi into a time of changes they cannot avoid, one which will continue long after his death.
There's a lot of sequel material there. How much of it translates well to a film whose audience expectations have been preconditioned by what was largely an action movie set on another planet remains an open question.
Friday, December 17, 2010
why we haven't yet seen real social computing
Yes, people want to share discoveries and experiences with others, particularly with their friends, but not necessarily with their "friends" as defined by the social computing service du jour, and, in most cases, emphatically not with that service interjecting itself into the relationship.
A real social computing system would be more ubiquitous than the telephone network, and easier to use than the postal network. It would, at least in principle, include everyone on the planet in one way or another, even those living outside the reach of ground-based communications networks and on the economic fringe, unable to afford a phone much less a computer and a satellite datalink. It would be all about allowing people to connect with the other people with whom they wanted to connect, individually, in groups, and in context, as well as to avoid the whole range of threats and parasites. And, as much as possible, it would get out of the way and allow those connections to play out as naturally as possible.
The closest thing we have to this at the moment is internet mail, which, rather than being a proprietary service, relies upon the interoperability of thousands of services, based on a collection of standard protocols. For all of its inadequacies, email is the best available model.
That's not to say email should or even could serve as the basis for that social computing environment of the future, which is likely to require a fresh start. But as a standards-based experiment in interoperability, it can serve as a starting point for thinking about what might be required.
Wednesday, December 15, 2010
the impossible (but inevitable) takes a little longer
While I'm not familiar with the company or their technology, the acquisition of Caustic Graphics by Apple supplier Imagination Technologies seems like a very good thing, not least because Imagination Tech's own products integrate well with ARM processors.
My interest in computing really got started in 1983, just prior to the introduction of the Macintosh in early 1984, and during the mid-80s I took several CS classes at Colorado State University. During that time I attended at least one meeting of the student chapter of the ACM, the advisor of which had come to Colorado State from the University of Utah and was a graphics specialist. He showed the group a ray-traced cartoon, which looked very realistic except that the characters were obviously composited from simple geometric figures (spheres, cylinders, and cones), and the motion betrayed a lack of application of the physics of mass, gravity, force, and momentum. For that time it was impressive.
He briefly discussed the amount of computing resources invested in its creation, the details of which I don't recall, but I was left with the impression that each frame consumed hours of CPU time. I remember commenting to him that it would be a while before we were doing that sort of thing in real time, to which his first reaction was a blank stare, as though the idea hadn't even occurred to him, followed a moment later by pointed agreement.
Twenty-five years later, it looks like that time is approaching.
Sunday, December 12, 2010
265 hours of newly released Nixon tapes
Think of Richard Nixon as a smarter, gentler version of Glenn Beck. It almost works, and says more about Beck than it does about Nixon.
While 265 hours of newly released tape recordings from his administration might well make for some interesting listening, they probably won't break any new ground with regard to the character of the only President ever to be hounded out of office between elections, a fate a few others have deserved more than he did, and one that his party has attempted to serve to every Democratic president since.
Nixon's bad luck was that he won the 1968 election, instead of 1960, inheriting America's military involvement in Vietnam at its height, and a country divided over what to do about it. That he took personally the criticism that inevitably stemmed from this situation, demonizing his political opponents, reveals a weakness in what was essentially a strong character. He had other weaknesses, of course, but how many people do you know who could go through what he went through and emerge from it only moderately bitter? He was made of sterner-than-average stuff, perhaps not quite up to what we expect from our Presidents, but few are.
Saturday, December 11, 2010
tiered service suggestion for MobileMe
Charles Jade, writing in TheAppleBlog, makes a suggestion, complete with pricing and service specifics, for a three-tiered version of MobileMe. In this post he makes one excellent point...
"By making MobileMe free, those using it with iOS devices won’t be using services from Google or Microsoft, which makes switching to Windows Phone 7 or Android more difficult. While PC users would also have MobileMe free, they’d need to have iOS devices to make it really worth using. The Halo Effect, which argues that iOS device sales later lead to Mac sales mitigates the loss associated with giving away MobileMe to PC users in the present. If they do switch, free MobileMe helps encourage them to remain all-Apple in the future. Free MobileMe would be an investment in hardware customer retention, and it doesn’t even have to be completely free."
Personally, I'd go one step further, integrating the bottom tier of MobileMe with the iTunes Store, customers of which already have unique Apple IDs. The online storage associated with MobileMe could then be used to secure purchases, say at a 1:10 ratio (1 GB allotment usage for 10 GB of purchases or rentals, which wouldn't actually have to be stored redundantly in your account), in case your machine encountered some disaster and you had no local backup. Premium versions of MobileMe, offering more storage, could cover proportionally more rentals or purchases.
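As a sketch of that arithmetic (the 1:10 ratio is the one floated above; the function name and numbers are purely illustrative, not any actual Apple service parameter):

```python
# Hypothetical illustration of the 1:10 "securing" ratio suggested
# above; nothing here is an actual MobileMe or iTunes mechanism.
def allotment_needed_gb(purchased_gb, ratio=10):
    """GB of MobileMe allotment consumed to secure a given volume of
    purchases or rentals, at 1 GB of allotment per `ratio` GB."""
    return purchased_gb / ratio

print(allotment_needed_gb(10))   # 10 GB of purchases -> 1.0 GB of allotment
print(allotment_needed_gb(200))  # even a big library costs only 20.0 GB
```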
Customers who currently have MobileMe and iTunes accounts associated with different IDs could have both (all) Apple IDs associated with a single merged account.
Given the fiscal need to tie revenue to products and services provided, Apple could lace the free version of MobileMe with iAds, purchasing them itself if necessary to ensure that the expenses involved were offset by revenue.
I'd also go one step further with the premium (family/workgroup/professional) version of MobileMe, providing it with some of the capabilities of Mac OS X Server, like the ability to create a wiki or a shared calendar in the cloud, which could be accessed by others with any type of MobileMe account, also merging in the iWork.com service. This tier should also have premium domain hosting and web authoring capabilities, like a professional version of iWeb.
Put enough value under one roof, at a price your customers perceive to be at least arguably a bargain, and there'll be many more customers than if they have difficulty justifying the purchase. This argument is far more compelling for services with relatively high up-front costs and low incremental costs than it is for hardware products with higher incremental costs.
Apple has maintained the price of MobileMe at a high enough level that they should be able to offer a complete, very sophisticated service without raising prices at all. By this time, the cost of operating a basic subset of that service should be low enough to be covered through tasteful advertising alone, allowing it to be offered for free, with the competitive benefits that Charles Jade outlines above.
Friday, December 10, 2010
Apple's ripple effects, and their ripple effects
Bloomberg reports that Foxconn, Apple's largest manufacturing partner, has exceeded one million employees in China. That's approximately 1/10 of 1% of China's overall population, or one person in every thousand, who are being exposed, hands-on, to Apple's high standards in design, materials, fabrication, and assembly, developing skills that are applicable elsewhere, and earning enough to be able to send a little home, save a bit, or make the occasional purchase of some non-essential product, like the iOS devices they're building. They're also participating in a massive exercise in logistics, as Foxconn scales up to handle the increasing demand for their primary customer's products.
Just as happened in Japan, which in the wake of World War II was the cheap, semi-skilled labor pool of the 1950s, Foxconn's workers are beginning to demand even better wages and working conditions. So long as Apple's competition is also going to China for their manufacturing needs, and accommodating the demands of workers can be accomplished without it working to the disadvantage of one player or another, workers can expect to see gradual improvement in both wages and working conditions. At some point, however, the temptation of lower labor costs elsewhere will surely result in the movement of some operations to Central or Southeast Asia, India, Africa, or South America. If that difference in costs translates to a market advantage, others will follow. No doubt Foxconn is very aware of this possibility and determined to remain competitive.
Foxconn's CEO has already stated that the company plans to eventually replace most assembly line workers with robots. Right now that's an expensive proposition, but if any company has both the means and the motivation to drive down the price of automation, it would be Foxconn. So, for some large percentage of those one million workers, their present jobs will last only so long as they don't price themselves out of the market, or until their places are taken by robots, whichever comes first. (Because the wages paid to Foxconn's workers are valued in proportion to the average income of a population 1000 times larger, those wages won't result in the same degree of inflation as occurred in Japan, so smaller wage increases can be expected, making the job loss to automation scenario more likely.)
That's, say, 800,000 at least semi-skilled workers, presumably with an acquired taste for quality, who will be released back into the Chinese labor pool, most likely gradually enough to be absorbed into other enterprises. Some will surely return to school to become engineers, while others will learn new trades and apply to them the uncompromising standards they're now learning by osmosis. The net effect will undoubtedly reach far beyond the wages they were paid while working for Foxconn, helping to boost China's fortunes generally and giving the country an even greater stake in maintaining stable relations with its neighbors and trading partners, improving the prospects for peace.
Meanwhile, those iOS devices they're producing will be helping to enliven minds around the world, with incalculable ripple effects.
Thursday, December 09, 2010
the error in Ryan's logic
In a gdgt.com article, the author, identified only as Ryan, concludes that doubling the dimensions of the iPad's display, as measured in pixels (quadrupling the total number of pixels), would not be enough to qualify the new device as having a "retina display", one with a grain finer than the human retina can discern, such as the Retina Display in the iPhone 4 (both are Apple trademarks). A cornerstone of his argument appears below...
"Now, the retina display was so named because Apple found that "there's a magic number right around 300 pixels per inch that... is the limit of the human retina to differentiate the pixels."* This assumes holding the device about a foot from your eyes, but I think most people tend hold their phone and their iPad at roughly the same distance (between 15 and 20 inches), it it's probably fair to assume that the iPad retina display should still be somewhere around 300 PPI.
* From Steve's WWDC 2010 keynote; skip to about 36:30 minutes for the retina display introduction."
That's some pretty strange reasoning.
First, I think he's wrong about most people holding phones 15 to 20 inches from their eyes, but even if he's right, it's the 12-inch assumption associated with the 300 ppi (pixels-per-inch) figure that matters. If a device is held further away than 12 inches, it requires fewer than 300 ppi to saturate the retina of the human eye: at 24 inches only 150 ppi are needed to achieve the same effect, and 200 ppi suffice at 18 inches.
At 2048 × 1536, a double-dimension iPad display is comfortably above 200 ppi, and even slightly above the 240 ppi that would be needed at 15 inches from the eyes, the lower limit of Ryan's own estimate of how far away people hold their iPads. So, assuming the guideline of 300 ppi at 12 inches is accurate, it would definitely qualify as a retina display.
I'm not predicting what Apple will do. The 1024 × 768 display in the first-generation iPad is already very crisp, and they just might decide to stick with it for another year, perhaps lowering prices a bit and concentrating on increasing frame rates in games, or they might increase the screen dimensions by a factor lower than 2. A 1.5× increase would mean 1536 × 1152, still qualifying as a retina display at 20 inches from the eyes, the upper limit of Ryan's range, while increasing the total number of pixels by a factor of only 2.25, low enough that they could probably still manage a performance increase by going to a CPU using dual A9 cores paired with any of several GPUs. More importantly, they might be able to put all that together without a price increase that would drive many people to other platforms.
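For the skeptical, here's a back-of-envelope check of those figures, assuming the 9.7-inch diagonal of the current iPad and taking the quoted 300-ppi-at-12-inches benchmark at face value:

```python
import math

def required_ppi(distance_in):
    """Density needed to saturate the eye at a given viewing distance,
    scaling the 300 ppi @ 12 in benchmark inversely with distance."""
    return 300 * 12 / distance_in

def actual_ppi(width_px, height_px, diagonal_in):
    """Density of a width x height panel with the given diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

for d in (12, 15, 18, 20, 24):
    print(f"at {d} in, {required_ppi(d):.0f} ppi suffice")
print(f"2048 x 1536 on 9.7 in: {actual_ppi(2048, 1536, 9.7):.0f} ppi")  # ~264
print(f"1536 x 1152 on 9.7 in: {actual_ppi(1536, 1152, 9.7):.0f} ppi")  # ~198
```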
Jumping to a 2048 × 1536 display sounds risky, but if Apple's suppliers can build them fast enough, without significant delays to tweak the production process, at a low enough price, and with a low enough failure rate, they just might go for it.
Tuesday, December 07, 2010
the misleading Mac vs. iOS dichotomy
Writing for Macworld, John Gruber makes a case for the near-term persistence of the Mac, but casts doubt on its long-term relevance; linking to the article from his own blog, he chooses the link text "All Good Things Must Come to an End", and the article itself supplies the implied subtext 'just not right now.'
Why so much gloom over the Mac's future when sales are through the roof? As Gruber says himself "The irony is that there’s more doubt today about the long-term prospects of the Mac than there has been at any time since Steve Jobs returned to Apple in 1997." The driving factor isn't Windows, of course, but iOS, Mac OS X's younger, more svelte sibling. People look at sales figures for the iPad, note it is already selling faster than the Mac, and start counting the days, weeks, months, or years until the Mac's demise.
Time for a reality check!
To begin with, iOS and Mac OS X are far more similar than different. The main difference between them is in the libraries supporting the user interface, AppKit on the Mac and UIKit on iOS devices. Below that level, they're practically identical, and becoming more nearly so with each release. (Parts which were originally left out of iOS due to resource limitations can be folded back in as more capable hardware becomes the norm, and some parts which originated in iOS are finding their way into Mac OS X.) The truth is that, after their initial divergence to support a touchscreen interface and limited hardware, and although they remain on separate tracks for the time being, in the long term they will probably reconverge. I'll come back to this point.
Not very long ago, "Mac" meant a machine with a keyboard, a pointing device (mouse or trackpad), an optical drive, a few ports, and a screen at least 13" from corner to corner, with a dual-core (or larger) Intel processor combined with a multi-core dedicated graphics processor, and running Mac OS X. The MacBook Air removed the optical drive from this definition, and more recently dropped the lower limit of the screen size to 11".
On the other hand, whereas the original iPhone required a bit of hacking before you could use an external keyboard with it, the iPad had a keyboard dock available in roughly the same time frame as its own release. Obviously, Apple recognizes an on-screen keyboard isn't an acceptable substitute for a physical keyboard for many purposes. And if they haven't yet made it possible to use a mouse or trackpad with an iPad, they certainly could. If they're holding off, it's probably because they're working on a comprehensive solution for the combination of two subtly different user interaction paradigms.
This is how I see iOS and Mac OS X converging, through each gaining the ability to support the other's UI paradigm. Just as you now see keyboards connected to iPads, you might also see touchscreens connected to Macs, something which has actually been possible for a while, thanks to Wacom, and I believe there are also apps which enable the connection of an iPad as a touchscreen peripheral, although they probably pair with specific Mac apps rather than providing general touchscreen utility.
In very simple terms, this means building a version of iOS including AppKit, and a version of Mac OS X including UIKit. The reality is no doubt a good deal more complicated, but that's the nutshell version.
For the developer, the path to taking maximum advantage of this is through distributing app components across the whole range of devices, with each device running the components that make sense for it, given its intrinsic capabilities. For Apple this presents a choice between leaving developers to work this out for themselves in a hundred different ways or to provide a framework which makes it straightforward. I can't imagine Apple wouldn't choose the latter.
So, if the Mac disappears at all, it will be disappearing into something larger and even more powerful, pieces of which will fit in your pocket, or on your wrist. Most likely, though, there will continue to be machines called Macs, using a keyboard and pointing device as their default paradigm, until the market for such machines shrivels up, by which time most of us will have ceased to care; otherwise, there would still be a market.
Saturday, December 04, 2010
RobotsPodcast.com
Have I mentioned the Robots Podcast here yet? Oh, probably, but it really does bear repetition.
Between the Robots Podcast and its predecessor, Talking Robots (scroll down), well over a hundred episodes have been archived, each of which features at least one interview with someone deeply involved in robotics and/or a related field (animal behavior, for instance), with occasional detours into cultural responses to the advent of adaptive machines. Actively listening to the entire collection must be equivalent to a graduate-level survey course in robotics.
Each episode has its own web page, and the links on those pages, taken together, read like a compendium of top-flight robotics programs and companies, if not comprehensive then a very good start on being so. It's a great way to get a quick overview of who is doing what, where.
Check it out: RobotsPodcast.com
will Apple sandbox Mac apps?
You know, most Windows users would hardly notice if they weren't able to use software other than Office to manipulate their Office-generated documents. They already behave very much as though Windows were a sandboxed environment, with one big sandbox and a handful of smaller ones.
That's a far cry from the experience of most Mac users. Aside from iTunes, which must be used to connect to Apple's online content and app stores; to a lesser extent Safari, which is the best browser for use with all Apple websites, including MobileMe; and Xcode, for programming for the Mac and iOS, there's no one Mac application or suite of applications that dominates any use category. For whatever you might want to do, there's a choice, and it's common for Mac users to first use one software tool, then another, then another, in a workflow that makes use of the best characteristics of several programs.
This is harder to do in a sandboxed environment, like iOS was originally and still is, except as developers take advantage of provisions for file sharing between applications.
Would Apple similarly cramp multi-app workflows on the Mac?
There are probably two answers to this question: "no" and "yes".
No, we're not going to find that some major update to Mac OS X comes at the price of the inability to save a file with one app and open it in another. Not tomorrow, probably not ever.
On the other hand, Apple probably will find a way to provide some of the security that iOS gains from sandboxing, without actually imposing sandboxes, for the most part.
One way in which they've already done this is their use of property list files, which use a small set of basic object types to wrap data, and make it extremely unlikely that data will be run as code by accident, or by any program other than the one that saved it. Property list support is ubiquitous in Apple's frameworks, and they're very simple for the application programmer to use.
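Python's standard-library plistlib gives a feel for that constraint; the preference keys here are made up for illustration:

```python
import plistlib

# A property list can hold only a handful of basic types (dicts,
# arrays, strings, numbers, booleans, raw data, dates), so whatever
# comes back out is plain data, never something executable.
prefs = {"theme": "graphite", "recent": ["a.txt", "b.txt"], "width": 1024}
blob = plistlib.dumps(prefs)      # serialize to XML plist bytes
restored = plistlib.loads(blob)   # parse back into plain objects
assert restored == prefs
```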
Something else Apple might do is verify that saved files conform to the type they claim to be, that a JPEG is actually organized as a JPEG and not concealing something that doesn't belong. Developers who provide executable definitions for their custom file types might be given more elbow room than those who don't go to the trouble, and those who don't might be presented with a choice between using standard file types and property lists exclusively or having their applications sandboxed, prompting some to cry "foul" and others to characterize them as whiners for doing so. Does the operating system have a right to know, in general terms, the content of every file? Of course it does; end of subject.
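A toy sketch of the kind of conformance check I mean (my own illustration, not any announced Apple mechanism); a real validator would walk the full JPEG segment structure rather than just the first and last markers:

```python
import os

def looks_like_jpeg(path):
    """Cheap sanity check: JPEG files open with the SOI marker (FF D8)
    and close with the EOI marker (FF D9)."""
    with open(path, "rb") as f:
        if f.read(2) != b"\xff\xd8":
            return False
        f.seek(-2, os.SEEK_END)
        return f.read(2) == b"\xff\xd9"
```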
That's a plausible scenario for how Mac OS X might evolve in the wake of the advent of iOS, far more plausible than the scarecrow that Mac owners might look up from some software update to find their machines locked down. Sure, some apps might be sandboxed, until and unless their developers get with the program, but not the system as a whole, and, most likely, not anything distributed by a reputable company, for which the user paid real money; such apps would already have been updated by the time the deadline arrives.
So quit worrying and enjoy the ride, and think twice before using a custom file type that you aren't prepared to nail down with a schema, or something similar.