Sunday, December 26, 2010

branching states

"State" here refers to a snapshot view of what's going on in a complex dynamic system.

A branching state is one that can go two or more different ways, which is to say one for which subsequent states are not just unpredictable but intrinsically indeterminate. There is a trivial sense in which this is always true, due to the impossibility of knowing precisely what's happening in any real complex system (as opposed to a computer simulation), but that's not what I'm talking about here.

A branching state is one that falls within the range of precursory conditions for more than one distinct outcome. In practical terms the art of creating and maintaining options is one of intentionally creating a branching state and delaying its resolution into commitment to a specific outcome. This is sometimes referred to as hedging one's bets, and accomplishing it without duplicate expenditure is the essence of good business management.

But branching states don't belong only to business; they're found in particle physics, music, and even the martial arts (techniques that originate in or pass through a common state which serves as an opportunity to switch between them). It's one of those general principles that form the vocabulary of systems.

Now for the point. I have a hunch humanity as a whole is either in or fast approaching a branching state, one that could go any of several very different ways. This is both scary and cause for hope. Change is a given, of course; it's the character of that change that remains in doubt.

Perhaps this would be a good time to take a lesson from business management, cultivating patience for the ambiguity of the situation, in the faith that the choices before us will become clearer with time, doing what seems best in the meantime.

Thursday, December 23, 2010

it's not about Steve: finger/moon

Steve Jobs isn't so proud that he's above taking in some good feelings from all the adulation he's received in recent years, and being named "person of the year" by the Financial Times has got to feel good, but I imagine he feels a little guilty about it, not because he doesn't deserve it, but because it's another example of missing the point.

Steve Jobs isn't about Steve Jobs; he's about all of the cool stuff he gets to help bring into existence, with special emphasis on what hasn't yet been publicly revealed; he's about assembling a great team, excoriating them when they get sidetracked or confused, and giving them room to run when they're on the track of something worthy of the name Apple; he's about building Apple into a persistent force for thinking different.

To focus on the man in preference to the substance of his vision is akin to focusing on the finger pointing at the moon rather than following its lead to the moon itself.

Tuesday, December 21, 2010

journalistic justice, the hard-won scoop

Architosh, which claims (with good reason) to be "the leading Internet magazine dedicated to Mac CAD and 3D professionals and students worldwide," and which is the third party most responsible for keeping the idea of AutoCAD on the Mac alive, has published an extensive interview (page 1 of 6) with Rob Maguire, the AutoCAD for Mac Product Manager who headed the team that implemented AutoCAD on Mac OS X.

Sunday, December 19, 2010

where to from Avatar?

If you already know what James Cameron has up his sleeve for the sequel(s) of Avatar, you might want to skip this post, because it's not about where he'll actually, eventually decide to go with the story, but about the constraints and choices he faces in making those decisions.

To begin with, you have a planet occupied by the Na'vi, who've had their moment of unmistakable first contact with an alien race, and won't be able to return to the innocence that preceded that moment. Moreover, as the first film ends, there's a small contingent of those aliens still on the planet, and most likely a few interstellar ships on the way at near lightspeed, with nowhere else to go other than to return to Earth, something they might not be able to do immediately or without resources from the planet. The Na'vi can cling to their nature-based ways, and their communion with Eywa, but they cannot forget that they are not alone in the cosmos, a fact of which they must surely be reminded each time their sun sets to reveal a sky half-lit by the gas giant their moon orbits, and by the other moons which share it.

More importantly, the victory they've won is temporary. If RDA decided to take retribution, they could do it from the safety of space, by simply throwing rocks, which would arrive as meteorites, at Na'vi settlements and other strategic locations, beginning with their own base, to prevent news of the attack from getting back to Earth. To really defend themselves, the Na'vi would need a space fleet capable of intercepting and destroying incoming rocks or missiles before they reached target, and quickly. Perhaps Eywa could help with such a mobilization, particularly with accomplishing it without sacrificing their essential selves, embedded in the biology of Pandora as they are. Perhaps the tendrils with which they accomplish tsaheylu might be employed as a means of rapid instruction in science and technology. Perhaps a small percentage of the Na'vi might show an aptitude for such learning that would qualify them as geniuses on Earth, rapidly progressing beyond what they'd been taught to break new theoretical ground.

For Jake and Neytiri, there's the question of how much of Jake survives in the body to which his mind and soul transferred with Eywa's help, and whether he shares that body with echoes of his dead twin brother, the scientist, for whom the body was created and who presumably spent several hundred hours driving it before his untimely death, and before Jake. There's also the question of whether Jake can rise to the moment when doing so means making use of his celebrity to lead the Na'vi into a time of changes they cannot avoid, one which will continue long after his death.

There's a lot of sequel material there. How much of it translates well to a film the expectations for which are preconditioned by what was largely an action movie set on another planet remains an open question.

Friday, December 17, 2010

why we haven't yet seen real social computing

Yes, people want to share discoveries and experiences with others, particularly with their friends, but not necessarily with their "friends" as defined by the social computing service du jour, and, in most cases, emphatically not with that service interjecting itself into the relationship.

A real social computing system would be more ubiquitous than the telephone network, and easier to use than the postal network. It would, at least in principle, include everyone on the planet in one way or another, even those living outside the reach of ground-based communications networks and on the economic fringe, unable to afford a phone much less a computer and a satellite datalink. It would be all about allowing people to connect with the other people with whom they wanted to connect, individually, in groups, and in context, as well as to avoid the whole range of threats and parasites. And, as much as possible, it would get out of the way and allow those connections to play out as naturally as possible.

The closest thing we have to this at the moment is internet mail, which, rather than being a proprietary service, relies upon the interoperability of thousands of services, based on a collection of standard protocols. For all of its inadequacies, email is the best available model.

That's not to say email should or even could serve as the basis for that social computing environment of the future, which is likely to require a fresh start. But as a standards-based experiment in interoperability, it can serve as a starting point for thinking about what might be required.

Wednesday, December 15, 2010

the impossible (but inevitable) takes a little longer

While I'm not familiar with the company or their technology, the acquisition of Caustic Graphics by Apple supplier Imagination Technologies seems like a very good thing, not least because Imagination Tech's own products integrate well with ARM processors.

My interest in computing really got started in 1983, just prior to the introduction of the Macintosh in early 1984, and during the mid-80s I took several CS classes at Colorado State University. During that time I attended at least one meeting of the student chapter of the ACM, the advisor of which had come to Colorado State from the University of Utah and was a graphics specialist. He showed the group a ray-traced cartoon, which looked very realistic except that the characters were obviously composited from simple geometric figures (spheres, cylinders, and cones), and the motion betrayed a lack of application of the physics of mass, gravity, force, and momentum. For that time it was impressive.

He briefly discussed the amount of computing resources invested in its creation, the details of which I don't recall, but I was left with the impression that each frame consumed hours of CPU time. I remember commenting to him that it would be a while before we were doing that sort of thing in real time, to which his first reaction was a blank stare, as though the idea hadn't even occurred to him, followed a moment later by pointed agreement.

Twenty-five years later, it looks like that time is approaching.

Sunday, December 12, 2010

265 hours of newly released Nixon tapes

Think of Richard Nixon as a smarter, gentler version of Glenn Beck. It almost works, and says more about Beck than it does about Nixon.

While 265 hours of newly released tape recordings from his administration might well make for some interesting listening, they probably won't break any new ground with regard to the character of the only President ever to be hounded out of office between elections, a fate a few others have deserved more than he did, and one that his party has attempted to serve to every Democratic president since.

Nixon's bad luck was that he won the 1968 election, instead of 1960, inheriting America's military involvement in Vietnam at its height, and a country divided over what to do about it. That he took personally the criticism that inevitably stemmed from this situation, demonizing his political opponents, reveals a weakness in what was essentially a strong character. He had other weaknesses, of course, but how many people do you know who could go through what he went through and emerge from it only moderately bitter? He was made of sterner-than-average stuff, perhaps not quite up to what we expect from our Presidents, but few are.

Saturday, December 11, 2010

tiered service suggestion for MobileMe

Charles Jade, writing in TheAppleBlog, makes a suggestion, complete with pricing and service specifics, for a three-tiered version of MobileMe. In this post he makes one excellent point...

"By making MobileMe free, those using it with iOS devices won’t be using services from Google or Microsoft, which makes switching to Windows Phone 7 or Android more difficult. While PC users would also have MobileMe free, they’d need to have iOS devices to make it really worth using. The Halo Effect, which argues that iOS device sales later lead to Mac sales mitigates the loss associated with giving away MobileMe to PC users in the present. If they do switch, free MobileMe helps encourage them to remain all-Apple in the future. Free MobileMe would be an investment in hardware customer retention, and it doesn’t even have to be completely free."

Personally, I'd go one step further, integrating the bottom tier of MobileMe with the iTunes Store, customers of which already have unique Apple IDs. The online storage associated with MobileMe could then be used to secure purchases, say at a 1:10 ratio (1 GB allotment usage for 10 GB of purchases or rentals, which wouldn't actually have to be stored redundantly in your account), in case your machine encountered some disaster and you had no local backup. Premium versions of MobileMe, offering more storage, could cover proportionally more rentals or purchases.

Customers who currently have MobileMe and iTunes accounts associated with different IDs could have both (all) Apple IDs associated with a single merged account.

Given the fiscal need to tie revenue to products and services provided, Apple could lace the free version of MobileMe with iAds, purchasing them itself if necessary to ensure that the expenses involved were offset by revenue.

I'd also go one step further with the premium (family/workgroup/professional) version of MobileMe, merging into the service some of the capabilities of Mac OS X Server, like the ability to create a wiki or a shared calendar in the cloud, which could be accessed by others with any type of MobileMe account. This tier should also have premium domain hosting and web authoring capabilities, like a professional version of iWeb.

Put enough value under one roof, at a price your customers perceive to be at least arguably a bargain, and there'll be many more customers than if they have difficulty justifying the purchase. This argument is far more compelling for services with relatively high up-front costs and low incremental costs than it is for hardware products with higher incremental costs.

Apple has maintained the price of MobileMe at a high enough level that they should be able to offer a complete, very sophisticated service without raising prices at all. And the cost of operating a basic subset of that service should by this time be low enough to be covered through tasteful advertising alone, allowing it to be offered for free, with the competitive benefits that Charles Jade outlines above.

Friday, December 10, 2010

Apple's ripple effects, and their ripple effects

Bloomberg reports that Foxconn, Apple's largest manufacturing partner, has exceeded one million employees in China. That's approximately 1/10 of 1% of China's overall population, or one person in every thousand, being exposed, hands-on, to Apple's high standards in design, materials, fabrication, and assembly, developing skills that are applicable elsewhere, and earning enough to be able to send a little home, save a bit, or make the occasional purchase of some non-essential product, like the iOS devices they're building. They're also participating in a massive exercise in logistics, as Foxconn scales up to handle the increasing demand for their primary customer's products.

Just as happened in Japan, which in the wake of World War II was the cheap, semi-skilled labor pool of the 1950s, Foxconn's workers are beginning to demand even better wages and working conditions. So long as Apple's competition is also going to China for their manufacturing needs, and accommodating the demands of workers can be accomplished without it working to the disadvantage of one player or another, workers can expect to see gradual improvement in both wages and working conditions. At some point, however, the temptation of lower labor costs elsewhere will surely result in the movement of some operations to Central or Southeast Asia, India, Africa, or South America. If that difference in costs translates to a market advantage, others will follow. No doubt Foxconn is very aware of this possibility and determined to remain competitive.

Foxconn's CEO has already stated that the company plans to eventually replace most assembly line workers with robots. Right now that's an expensive proposition, but if any company has both the means and the motivation to drive down the price of automation, it would be Foxconn. So, for some large percentage of those one million workers, their present jobs will last only so long as they don't price themselves out of the market, or until their places are taken by robots, whichever comes first. (Because the wages paid to Foxconn's workers are valued in proportion to the average income of a population 1000 times larger, those wages won't result in the same degree of inflation as occurred in Japan, so smaller wage increases can be expected, making the job loss to automation scenario more likely.)

That's, say, 800,000 at least semi-skilled workers, presumably with an acquired taste for quality, that will be released back into the Chinese labor pool, most likely gradually enough to be absorbed into other enterprises. Some of these will surely return to school to become engineers, while others will learn new trades and apply to them the uncompromising standards they're now learning by osmosis. The net effect will undoubtedly reach far beyond the wages they were paid while working for Foxconn, helping to boost China's fortunes generally and giving the country an even greater stake in maintaining stable relations with their neighbors and trading partners, improving the prospects for peace.

Meanwhile, those iOS devices they're producing will be helping to enliven minds around the world, with incalculable ripple effects.

Thursday, December 09, 2010

the error in Ryan's logic

In an article, the author, identified only as Ryan, concludes that doubling the dimensions of the iPad's display, as measured in pixels (quadrupling the total number of pixels), would not be enough to qualify the new device as having a "retina display" (one with a grain finer than the human retina can discern, such as the Retina Display in the iPhone 4, both Apple trademarks). A cornerstone of his argument appears below...

"Now, the retina display was so named because Apple found that "there's a magic number right around 300 pixels per inch that... is the limit of the human retina to differentiate the pixels."* This assumes holding the device about a foot from your eyes, but I think most people tend hold their phone and their iPad at roughly the same distance (between 15 and 20 inches), it it's probably fair to assume that the iPad retina display should still be somewhere around 300 PPI.

* From Steve's WWDC 2010 keynote; skip to about 36:30 minutes for the retina display introduction."

That's some pretty strange reasoning.

First, I think he's wrong about most people holding phones 15 to 20 inches from their eyes, but even if he's right, it's the 12-inch assumption associated with the 300 ppi (pixels-per-inch) figure that matters. If a device is held farther away than 12 inches, fewer than 300 ppi are needed to saturate the retina of the human eye: 200 ppi at 18 inches, or only 150 ppi at 24 inches.

At 2048 × 1536, a double-dimension iPad display is comfortably above 200 ppi, and even slightly above the 240 ppi that would be needed at 15 inches from the eyes, the lower limit of Ryan's own estimate of how far away people hold their iPads. So, assuming the guideline of 300 ppi at 12 inches is accurate, it would definitely qualify as a retina display.

I'm not predicting what Apple will do. The 1024 × 768 display in the first-generation iPad is already very crisp, and they just might decide to stick with it for another year, perhaps lowering prices a bit and concentrating on increasing frame rates in games, or they might increase the screen dimensions by a factor lower than 2. A 1.5× increase would mean 1536 × 1152, still qualifying as a retina display at 20 inches from the eyes, the upper limit of Ryan's range, while increasing the total number of pixels by a factor of only 2.25, low enough that they could probably still manage a performance increase by going to a CPU using dual A9 cores paired with any of several GPUs. More importantly, they might be able to put all that together without a price increase that would drive many people to other platforms.
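The figures above are easy to check. Here's a small sketch (the function names are my own) applying the 300-ppi-at-12-inches rule of thumb and the iPad's 9.7-inch diagonal:

```python
import math

def display_ppi(width_px, height_px, diagonal_in):
    """Pixel density of a screen, from its pixel dimensions and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_threshold_ppi(distance_in, baseline_ppi=300.0, baseline_distance_in=12.0):
    """PPI needed to saturate the eye at a given viewing distance,
    scaling the 300-ppi-at-12-inches guideline inversely with distance."""
    return baseline_ppi * baseline_distance_in / distance_in

# Hypothetical resolutions on a 9.7-inch iPad panel
print(round(display_ppi(2048, 1536, 9.7)))  # -> 264
print(round(display_ppi(1536, 1152, 9.7)))  # -> 198

# Thresholds at the ends of Ryan's 15-to-20-inch viewing range
print(retina_threshold_ppi(15))  # -> 240.0
print(retina_threshold_ppi(20))  # -> 180.0
```

Both candidate displays clear the threshold for their respective distances: 264 ppi beats the 240 needed at 15 inches, and 198 ppi beats the 180 needed at 20 inches.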

Jumping to a 2048 × 1536 display sounds risky, but if Apple's suppliers can build them fast enough, without significant delays to tweak the production process, at a low enough price, with a low enough failure rate, they just might go for it.

Tuesday, December 07, 2010

the misleading Mac vs. iOS dichotomy

Writing for Macworld, John Gruber makes a case for the near-term persistence of the Mac, but casts doubt on its long-term relevance. Linking to the article from his own blog, he chooses the link text "All Good Things Must Come to an End," while the article itself provides an implied subtext: 'just not right now.'

Why so much gloom over the Mac's future when sales are through the roof? As Gruber himself says, "The irony is that there’s more doubt today about the long-term prospects of the Mac than there has been at any time since Steve Jobs returned to Apple in 1997." The driving factor isn't Windows, of course, but iOS, Mac OS X's younger, more svelte sibling. People look at sales figures for the iPad, note that it is already selling faster than the Mac, and start counting the days, weeks, months, or years until the Mac's demise.

Time for a reality check!

To begin with, iOS and Mac OS X are far more similar than different. The main difference between them is in the libraries supporting the user interface, AppKit on the Mac and UIKit on iOS devices. Below that level, they're practically identical, and becoming more nearly so with each release. (Parts which were originally left out of iOS due to resource limitations can be folded back in as more capable hardware becomes the norm, and some parts which originated in iOS are finding their way into Mac OS X.) The truth is that, despite their initial divergence to support a touchscreen interface on limited hardware, and despite remaining on separate tracks for the time being, in the long term they will probably reconverge. I'll come back to this point.

Not very long ago, "Mac" meant a machine with a keyboard, a pointing device (mouse or trackpad), an optical drive, a few ports, and a screen at least 13" from corner to corner, with a dual-core (or larger) Intel processor combined with a multi-core dedicated graphics processor, and running Mac OS X. The MacBook Air removed the optical drive from this definition, and more recently dropped the lower limit of the screen size to 11".

On the other hand, whereas the original iPhone required a bit of hacking before you could use an external keyboard with it, the iPad had a keyboard dock available in roughly the same time frame as its own release. Obviously, Apple recognizes an on-screen keyboard isn't an acceptable substitute for a physical keyboard for many purposes. And if they haven't yet made it possible to use a mouse or trackpad with an iPad, they certainly could. If they're holding off, it's probably because they're working on a comprehensive solution for the combination of two subtly different user interaction paradigms.

This is how I see iOS and Mac OS X converging: through each gaining the ability to support the other's UI paradigm. Just as you now see keyboards connected to iPads, you might also see touchscreens connected to Macs, something which has actually been possible for a while, thanks to Wacom, and I believe there are also apps which enable the connection of an iPad as a touchscreen peripheral, although they probably pair with specific Mac apps, rather than providing general touchscreen utility.

In very simple terms, this means building a version of iOS including AppKit, and a version of Mac OS X including UIKit. The reality is no doubt a good deal more complicated, but that's the nutshell version.

For the developer, the path to taking maximum advantage of this is through distributing app components across the whole range of devices, with each device running the components that make sense for it, given its intrinsic capabilities. For Apple this presents a choice between leaving developers to work this out for themselves in a hundred different ways or to provide a framework which makes it straightforward. I can't imagine Apple wouldn't choose the latter.

So, if the Mac disappears at all, it will be disappearing into something larger and even more powerful, pieces of which will fit in your pocket, or on your wrist. Most likely, though, there will continue to be machines called Macs, using a keyboard and pointing device as their default paradigm, until the market for such machines shrivels up, by which time most of us will have ceased to care, else there would still be a market.

Saturday, December 04, 2010

Have I mentioned the Robots Podcast here yet? Oh, probably, but it really does bear repetition.

Between the Robots Podcast and its predecessor, Talking Robots (scroll down), well over a hundred episodes have been archived, each of which features at least one interview with someone deeply involved in robotics and/or a related field (animal behavior, for instance), with occasional detours into cultural responses to the advent of adaptive machines. Actively listening to the entire collection must be equivalent to a graduate-level survey course in robotics.

Each episode has its own web page, and the links on those pages, taken together, read like a compendium of top-flight robotics programs and companies, if not comprehensive then a very good start on being so. It's a great way to get a quick overview of who is doing what, where.

Check it out:

will Apple sandbox Mac apps?

You know, most Windows users would hardly notice if they weren't able to use software other than Office to manipulate their Office-generated documents. They already behave very much as though Windows were a sandboxed environment, with one big sandbox and a handful of smaller ones.

That's a far cry from the experience of most Mac users. Aside from iTunes, which must be used to connect to Apple's online content and app stores, to a lesser extent Safari, which is the best browser for use with all Apple websites, including MobileMe, and Xcode for programming for the Mac and iOS, there's no one Mac application or suite of applications that dominates any use category. For whatever you might want to do, there's a choice, and it's common for Mac users to first use one software tool, then another, then another, in a workflow that makes use of the best characteristics of several programs.

This is harder to do in a sandboxed environment, like iOS was originally and still is, except as developers take advantage of provisions for file sharing between applications.

Would Apple similarly cramp multi-app workflows on the Mac?

There are probably two answers to this question, "no" and "yes".

No, we're not going to find that some major update to Mac OS X comes at the price of the inability to save a file with one app and open it in another. Not tomorrow, probably not ever.

On the other hand, Apple probably will find a way to provide some of the security that iOS gains from sandboxing, without actually imposing sandboxes, for the most part.

One way in which they've already done this is their use of property list files, which use a small set of basic object types to wrap data, and make it extremely unlikely that data will be run as code by accident, or by any program other than the one that saved it. Property list support is ubiquitous in Apple's frameworks, and they're very simple for the application programmer to use.
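To illustrate the format (using Python's standard plistlib module here, rather than Apple's own frameworks, just to show the idea): a property list round-trips plain data through a small set of basic types, and nothing in it is executable.

```python
import plistlib

# Property lists wrap data in a handful of basic types (dicts, arrays,
# strings, numbers, booleans, dates, raw data) -- nothing executable.
prefs = {
    "windowWidth": 1024,
    "recentFiles": ["notes.txt", "draft.txt"],
    "showToolbar": True,
}

blob = plistlib.dumps(prefs)      # serialize to XML plist bytes
restored = plistlib.loads(blob)   # parse back into plain objects

assert restored == prefs
```

Because the parser only ever produces those basic types, a program reading someone else's plist can't be tricked into running it as code.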

Something else Apple might do is to verify that saved files conform to the type they claim to be, that a JPEG is actually organized as a JPEG and not concealing something that doesn't belong. Developers that provide executable definitions for their custom file types might be given more elbow room than those who don't go to the trouble, and developers who don't go to the trouble might be presented with a choice between using standard file types and property lists exclusively or having their applications sandboxed, prompting some to cry "foul" and others to characterize them as whiners for doing so. Does the operating system have a right to know, in general terms, the content of every file? Of course it does; end of subject.
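As a minimal sketch of the kind of check involved (not Apple's actual mechanism, and the function name is mine), validating a claimed file type can start with the format's signature bytes, here JPEG's:

```python
def looks_like_jpeg(data: bytes) -> bool:
    """Minimal structural check: JPEG streams start with the SOI marker
    (0xFFD8) and end with the EOI marker (0xFFD9)."""
    return len(data) >= 4 and data[:2] == b"\xff\xd8" and data[-2:] == b"\xff\xd9"

# A file claiming to be a JPEG but actually carrying something else
# would fail this check before any image decoder ever touched it.
assert looks_like_jpeg(b"\xff\xd8" + b"\x00" * 100 + b"\xff\xd9")
assert not looks_like_jpeg(b"MZ\x90\x00")  # a Windows executable header
```

A real validator would walk the marker segments between those two bytes, but the principle is the same: the system confirms, in general terms, that the content matches the label.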

That's a plausible scenario for how Mac OS X might evolve in the wake of the advent of iOS, far more plausible than the scarecrow that Mac owners might look up from some software update to find their machines locked down. Sure, some apps might be sandboxed, until and unless their developers get with the program, but not the system as a whole, and, most likely, not anything distributed by a reputable company, for which the user paid real money; such apps would already have been updated by the time the deadline arrives.

So quit worrying and enjoy the ride, and think twice before using a custom file type that you aren't prepared to nail down with a schema, or something similar.

Friday, November 19, 2010

TBL warns Web fragmenting into walled gardens

While I've yet to read more than the first page of Tim Berners-Lee's latest, I've already stumbled across several statements that merit response, the following in particular.

TBL writes "The Web evolved into a powerful, ubiquitous tool because it was built on egalitarian principles and because thousands of individuals, universities and companies have worked, both independently and together as part of the World Wide Web Consortium, to expand its capabilities based on those principles."

Certainly, in part, but the potential for profit played no small part in motivating most of those involved, who shared that egalitarian vision only insofar as it served their own proprietary purposes. For example, contending browser vendors attempted to push through their own extensions to the initial standards, with an eye to reaping licensing fees for their use.

It's testimony to the purity of TBL's own motivation that he can still see the development of the Web in such terms, and apparently believe that it is only recently succumbing to the taint of divisive commercial interests, but, as well informed as his view undeniably is, it fails the test of objectivity.

There was never any chance that the Web could grow as it has while existing in isolation from such influences. The best possible outcome would be if the Web continues to make provision for the unencumbered exchange of information and opinion, alongside the walled gardens and pay-to-play sites, far into the future.

Monday, November 15, 2010

something BIG

When Steve Jobs suggests that Apple's $50+ Billion in liquid assets was being saved for the opportunity to acquire something big, we shouldn't confine our thinking to the obvious, companies with market caps smaller than Apple's mountain of cash.

Take, for instance, HP. If things continue as they are for a while longer, Apple's cash reserves will surpass HP's market cap in a few years, but even before that it might be possible to execute an acquisition that included an issue of new stock.

So, why would Apple be the least bit interested in buying HP, nostalgia aside?

HP is among the more nimble of the well established technology companies, and has a reputation for quality that suggests its corporate culture might at least be compatible with Apple's. It has a more diverse product line than does Apple, with distribution channels to match, and has a huge collection of intellectual property, including that recently acquired along with Palm.

But HP sells computers that run Windows! What good would that be to Apple?

To begin with, Apple could make sure those computers all shipped with Safari, iTunes, and MobileMe Control Panel preloaded, and include the same trial MobileMe account that they do for Mac buyers. Apple has plenty of experience with supporting Windows, so this much wouldn't even be a challenge.

Beyond that, Apple could alter the circuit board designs to make them Mac-compatible, and offer them with Windows, Mac OS X, or both, picking up some additional after-market business from people who bought Windows-only machines and later had regrets. People who were leaning towards getting a PC but tempted by Macs would flock to HP in droves, taking business away from its PC competitors.

I'm not predicting that Apple will acquire HP, only pointing out that a case for doing so might be made, and suggesting that this illustrates how broad a net is needed to gather in all of the possibilities for what Apple might do with its hoard.

Friday, November 05, 2010

of rack-mount systems and mysterious data centers

Standard 19-inch and 23-inch rack-mount systems exude the sort of geekiness found in serious IT departments and data-services operations. Is it even conceivable that their days might be numbered?

Apple's choice to discontinue their Xserve line just as their own huge data center in North Carolina is about to come online might seem to suggest they think so, but chances are that new data center will be chock-full of rack-mount hardware, just mainly not Xserves.

I expect a significant contributing factor in Apple's decision to discontinue the Xserve is that they found it wasn't a competitive option for them in equipping the new Maiden, NC facility, even considering that they could sell it to themselves for something like 30% below retail.

Perhaps also, and this is pure speculation, given their investment in miniaturization, a 19-inch rack-mount system is simply too inefficient in its use of space. That might not seem relevant when the size of your data center is measured in acres, but once it fills up that space becomes precious, and configurations that waste it won't survive long. Perhaps they plan to switch to a narrower (10-inch?) rack-mount system that can fit nearly twice as many devices into the same space. Given the size of the facility, they can probably design and build something for themselves more economically than they could buy from another supplier, especially considering the possibility of using A4 chips or similar ARM-based SoCs, paired with commodity hard drives and running some variant of Darwin, iOS, or Mac OS X. Doing so would also provide them with valuable experience, helping them gain traction with the enterprise in the future.
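To put rough numbers on that "nearly twice as many devices" intuition, here's a back-of-the-envelope sketch. The rack widths are nominal rail spacings, and the row length and devices-per-rack figure are illustrative assumptions, not anything Apple has disclosed.

```python
# Back-of-the-envelope comparison of hypothetical 10-inch racks against
# standard 19-inch racks. Frame overhead and aisle space are ignored.

STANDARD_RACK_IN = 19.0
NARROW_RACK_IN = 10.0   # the hypothetical narrower width from the post

def devices_per_row(row_width_in, rack_width_in, devices_per_rack=42):
    """Racks that fit side by side in a row, times devices per rack."""
    return int(row_width_in // rack_width_in) * devices_per_rack

row = 600.0  # a 50-foot row of racks, for illustration
standard = devices_per_row(row, STANDARD_RACK_IN)
narrow = devices_per_row(row, NARROW_RACK_IN)
print(standard, narrow, round(narrow / standard, 2))  # → 1302 2520 1.94
```

So under these toy assumptions the narrower racks do come close to doubling density, which is the whole argument for them.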

On the other hand, their need for such a data center may have developed too quickly for them to rely upon technology developed in-house, at least at the outset.

For most of us, the nature of the services that data center will support is more important and far more interesting, but those of us with a geek streak will continue to wonder about the technology in use until such questions are answered.

getting a handle on Flash

Cult of Mac has published instructions for uninstalling Flash, together with instructions for reinstalling it if you decide that's what you want, and a pointer to ClickToFlash, which replaces Flash content with an inert rectangle bearing the word "Flash", unless you click on that rectangle, in which case the Flash content is displayed.

And here's John Gruber of Daring Fireball on essentially the same topic.

Friday, October 29, 2010

code as content; code as a vector of change

First came symbolic speech. Thousands of languages and dialects blossomed, and the words of some, the shamans, those believed to have direct experience of a spirit world, seemed to possess magical potential.

Then came writing. The great variety of spoken language began to be replaced by preservation approaching permanence, joined shortly by the rigor of peer review and explicit criticism, and the magical potential of the words of the shamans became invested instead in interpreters, the priests and scribes.

Perhaps presaging what was to follow, a variant of writing, plays, developed into instructions to be performed by acting companies.

Then came machine code, a variant of writing that controls the operation of hardware designed to process such instructions. Very early on, machine code gained conditional branching, the ability to perform different sets of instructions based on the value of some numerical/logical expression. At that point it must have already been apparent to a few that something like machine code would eventually surpass conventional writing, by virtue of its potential to directly control the actions performed by machines. At about the same time, a process of increasing abstraction began, whereby machine code was wrapped in assembler code, which was itself wrapped in higher-level languages, more closely resembling conventional writing.
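For readers who've never seen it spelled out, conditional branching can be illustrated with a toy sketch (Python here, purely for illustration): the same decision expressed once in a high-level language and once as instructions for a minimal, made-up machine.

```python
# The same decision at two levels of abstraction.

def high_level(x):
    if x >= 0:          # the numerical/logical expression
        return "positive branch"
    return "negative branch"

# The same logic as a tiny instruction list of (opcode, operand) pairs.
# BRANCH_IF_NEG jumps to the given address when the accumulator is negative.
def run(program, acc):
    pc = 0  # program counter
    while True:
        op, arg = program[pc]
        if op == "BRANCH_IF_NEG":
            pc = arg if acc < 0 else pc + 1
        elif op == "RETURN":
            return arg

PROGRAM = [
    ("BRANCH_IF_NEG", 2),           # address 0
    ("RETURN", "positive branch"),  # address 1
    ("RETURN", "negative branch"),  # address 2
]

print(high_level(5), "/", run(PROGRAM, -3))  # → positive branch / negative branch
```

The high-level form is what the "higher-level languages" mentioned above provide; the instruction list is roughly what the hardware actually consumes.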

Thus far, the impact of computer code on what these days passes for natural language (speech having already been molded by thousands of years of close association with writing) has been to facilitate its production, dissemination, and consumption, but that's only a small part of the whole story.

Computer code, embedded in machinery, has the potential to render meaning tangibly, as real physical performance, with real consequences, good or bad. Given that the design and production of machinery is itself becoming increasingly automated, the question becomes one of what you want the machines to do, and not do, and how those desires can be represented in code.

For example, I can say that I want to preserve what remains of Earth's original biological diversity, while at the same time reducing the dependence of agriculture on petroleum, but in that form it is merely a feeble wish, displaced by the next thought. I can write a treatise explaining why we ought to do what we can to preserve what remains of Earth's original biological diversity and free agriculture from dependence on petroleum, and it may stir others momentarily, but it takes more than that to make any real difference, and if I were asked exactly what I'm talking about in practical terms, my answer isn't likely to satisfy those whose livelihoods would be affected by any such initiative.

If, on the other hand, I express my intention in the language of machine design and control logic (computer code), the implications, not only of the basic design but of various approaches to managing the system, can be explored through simulation, and that expression goes a long way towards constituting a detailed plan for its own implementation. It's all just code, even the design for the physical machinery, but in a form that makes the decision to go forward with it almost as easy as the decision to flip a switch, at least as compared with a vague call for the desired end results.
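As a toy illustration of that shift from wish to executable intention, here is a hypothetical sketch; the species labels, the confidence threshold, and the policy itself are all invented for the example, but once written this way the policy's consequences can be explored by simulation rather than argued in the abstract.

```python
# The vague wish "control weeds while sparing everything else" rendered as
# control logic, then exercised against a simulated field. All names and
# thresholds here are hypothetical.
import random

def weed_policy(plant):
    """Act only on confidently identified weeds; leave everything else."""
    if plant["species"] == "weed" and plant["confidence"] > 0.9:
        return "pull"
    return "skip"

random.seed(1)  # reproducible toy data
field = [{"species": random.choice(["crop", "weed", "unknown"]),
          "confidence": random.random()} for _ in range(1000)]

pulled = sum(weed_policy(p) == "pull" for p in field)
non_weeds = sum(p["species"] != "weed" for p in field)
print(f"pulled {pulled} weeds; all {non_weeds} non-weeds untouched")
```

The point isn't this particular rule, which is simplistic, but that disagreements about it become concrete: change the threshold, rerun the simulation, compare outcomes.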

Such a project would be too big for any individual, of course, so tools that facilitate collaboration on such projects are needed. Some such tools already exist; others remain to be invented, and much effort is being expended in this direction, even if those involved don't see their work in such grand terms, with the potential to achieve change that could never be achieved through conventional political means alone.

Thursday, October 21, 2010

the parts of Lion which remain secret

I must confess some disappointment over yesterday's announcement, not because the stuff they showed wasn't cool - it was! - but because the stuff that interests me the most remains secret. What we saw of Lion were mainly user interface enhancements, a category that was famously, intentionally missing from Snow Leopard, which concentrated on bug fixes and lower level enhancements. It might even be that the pause in the introduction of new user interface features, represented by Snow Leopard, made possible the integration of such features seen in Lion's new Mission Control view.

But what's Lion got to compare with OpenCL and Grand Central Dispatch? Something, most likely, but that something remains secret. Does it have a new file system or file system abstraction layer? Does it bring true resolution independence? Are any technologies developed for Apple application software, like iPhoto's face recognition, moving into system code and becoming available to third party applications? Are there any important new APIs?

Patience, I try to tell myself, which of the major releases of Mac OS X has not brought such advancements? Maybe 10.1, which was primarily a bug fix and code efficiency update to 10.0. Are they out of ideas? Surely not! Are they underfunding lower level R&D? Not likely. What then?

A Mac OS X engineering position announcement a while back stated quite plainly that a successful applicant could wind up working on something unprecedented and revolutionary. Sure, Apple spreads such verbiage a little thick at times, but the clear implication was that there's something big happening in the wings, and the timing was such that it's likely to be included in Lion, which won't be released until next summer.

Moreover, in yesterday's collection of announcements, everyone on stage was careful to point out that only some of Lion's features were being shown. Clearly there's something else, something they aren't yet ready to talk about.

Something else Steve was careful about: in discussing touch input on Macs, and how they've concluded that it just doesn't work on a vertical screen, he was talking about laptops, not necessarily all Macs. So, perhaps this recently revealed Apple patent is more than a design exercise. Time will tell.

Wednesday, October 13, 2010

Nice kitty!

The invitation for an Apple Event taking place one week from today strongly suggests that the primary topic of the day will be the next major version of Mac OS X, 10.7, and that the cat-name for this version will be "Lion". Cheetah, Puma, Jaguar (the first to be marketed as such), Panther, Tiger, Leopard, Snow Leopard, and now Lion.

Recent versions have brought, in no particular order, the Kernel Extension APIs, the Acceleration Framework, Spotlight, Core Animation, Core Data, OpenCL, Grand Central Dispatch, and Objective-C 2. What might 10.7 have in store to match these?

Something, no doubt. ;-)

Friday, September 17, 2010

China meets the iPad

China isn't just another market, much as the iPad isn't just another gadget.

China is the most populous country on Earth and has one of the fastest growing economies. So much for the obvious.

While China's economy is rocketing upwards at an annual rate of eight percent, the spending power of most individual citizens is still modest, which makes the affordable iPad a good match.

What makes it an even better match is the touchscreen input, which doesn't discriminate against character-based input, providing the iPad an even greater advantage over keyboard-based netbooks in China than elsewhere.

The advantage the iPad has over other touchscreen tablets, running other operating systems, is more subtle but still substantial, deriving from factors such as attention to detail in both hardware and software, a fast-maturing application software market, and the goodwill Apple has gained through its efforts to improve conditions for the hundreds of thousands of Chinese workers involved in manufacturing its products (even though it turns out those conditions weren't so bad in the first place, aside from the economic incentives for workers to put in exorbitant amounts of overtime, or to end their lives for the compensation their families would receive as a result).

The iPad is destined to be a particularly huge hit in China, but the whole world will benefit from this.

Tuesday, August 31, 2010

AutoCAD returns to the Mac, and comes to iOS in the bargain

As before, I merely pass along a pointer to Architosh's report, which discusses the iOS companion program as well as AutoCAD itself.

Architosh also reports on what appears to be a 60-second TV spot, produced (or at least paid for) by Autodesk, the company behind AutoCAD.

Monday, August 30, 2010

perseverance and pride

I've already said most of what there is to say about the event itself here, but what I only mentioned briefly there is that it's been a long, long time coming, with the project going through metamorphosis again and again. I had a need to do something that would contribute to bringing the world of music back around to an appreciation of the role of harmonics in melody, not just in harmony, or at least that's how it began. Arguably, I've accomplished that.

Along the way it also became an issue of pride in craftsmanship, as an amateur programmer with experience in Apple's approach to supporting application software. That and avarice may collaborate to drive the project further, perhaps much further, but for now I have a sense of contingent satisfaction, the degree of satisfaction being contingent on the degree of traction my creation manages to garner.

Of course, a free app with no ad revenue doesn't do much to feed avarice, so I'm left with pride in craftsmanship, for now, which is probably just as well; it's what drives me to do my best work.

Thursday, August 26, 2010

party politics in a nutshell

I'm registered to vote, but not affiliated with any political party, at the moment.

That said, here's what remains of my impressions of the two main political parties.

The Democrats have a huge investment in appearing high-minded, and consequently manage to be right about what should happen a little more often.

The Republicans have two distinct constituencies, for one of which they must appear to be at least adequately morally upright, and for the other of which they must make it clear that morality doesn't extend to the wholesale confiscation and repurposing of wealth. It's a delicate dance, and, as a consequence, they manage to be right about what will happen a little more often.

Both are people, with all that means, good and bad.

Tuesday, August 17, 2010

how to improve upon Mac OS X 10.6?

Update: John Siracusa weighs in

From my perspective, that looks like a tall order, but so far it's looking like a slow news week, and Cristopher Ryan, writing in the Apple Blog, has asked What Could Make OS X 10.7 Great? so I'll have a go.

Without really understanding the issues involved, let me echo Ryan's call for a new file system. Everyone who followed such developments seemed encouraged by the hope that ZFS was just around the corner, when it appeared to be so, and then Oracle bought Sun Microsystems and suddenly we were back in bed with HFS+, with no realistic alternatives on the horizon. Meanwhile, whatever Apple is using for a file system in iOS devices is fairly free to evolve, since iOS apps are sandboxed and only have access to the narrow slice of that system that relates directly to them. Also, being flash memory devices with no hard drives, they have different abilities and requirements as compared with a desktop machine. One possibility for the future is that iOS's refined handling of flash memory might be grafted into HFS+ at about the same time that Macs other than the MacBook Air get a bank of flash memory or a solid state drive to augment their massive (but slower) hard drives. Granted, this probably won't be enough to satisfy those who really understand file system issues.

Another iOS capability that might be brought to Mac OS X is the full-blown touch interface, not just the trackpad gestures supported by the Magic Trackpad (which I love!). (If you think the possibility of touchscreen Macs isn't even on Apple's radar, check out this and this.)

Something not yet implemented in iOS that I hope to see sooner rather than later is low-level support for machine vision, stereo machine vision to be precise, meaning dual video cameras atop MacBook screens and Cinema Displays, or, perhaps better, a dual-camera accessory, also sporting stereo microphones, with motorized pan, tilt, and zoom, as well as automatic focus and aperture, benefitting from the same attention to detail as the original iSight. It would be enough if it could do good-quality stereo video recording, editable in iMovie, to begin with, leaving any machine vision applications for the following year.

Mostly, I'd like Apple to continue with the transformative process that was the main selling point of 10.6 over 10.5, making the next version even more coherent, robust, and svelte.

Sunday, August 08, 2010

whichever comes first

An Engadget article on the relationship between Apple and AT&T cites a legal action as confirming "the handset was originally locked to AT&T / Cingular for a full five years." In reading that it occurred to me that there might have been another factor written into the original contract, some number of iPhone activations, that would automatically shorten the period of exclusivity if it were to be reached before the full term expired, without any renegotiation of the contract.

Given that iPhones have out-sold everyone's expectations, such a clause could be expected to fire, but the exact timing would remain indefinite until just a few days remained.
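A sketch of how such a hypothetical clause might be modeled: exclusivity ends at the earlier of the full term's expiry or the day cumulative activations cross a threshold. All the dates, counts, and the threshold below are invented for illustration, not actual contract terms.

```python
# Model of a hypothetical "whichever comes first" exclusivity clause.
from datetime import date

def exclusivity_end(term_end, activation_log, threshold):
    """activation_log: (date, count) pairs in chronological order.
    Returns the date exclusivity ends: the day the running total of
    activations reaches the threshold, or term_end if it never does."""
    total = 0
    for day, count in activation_log:
        total += count
        if total >= threshold:
            return min(day, term_end)
    return term_end

# Invented activation milestones:
log = [(date(2008, 7, 11), 1_000_000),
       (date(2009, 6, 19), 5_000_000),
       (date(2010, 6, 24), 9_000_000)]

print(exclusivity_end(date(2012, 6, 29), log, 10_000_000))  # → 2010-06-24
```

With these made-up numbers the threshold fires in mid-2010, well before the full term, which is exactly the sort of early, automatic shortening the post speculates about.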

Thursday, July 29, 2010

new version of Safari feels faster

I've got nothing to compare them with, but here are my SunSpider JavaScript results (late-'08 2.4 GHz MacBook, running Mac OS X 10.6.4).

Saturday, July 24, 2010

breaking Windows dependency, a stepwise approach

Apple is beginning to invest in outreach to small businesses, but for this initiative to enjoy maximum success they'll need to understand the constraints that keep businesses dependent on Microsoft's inferior Windows platform.

Mainly it's a matter of time, or the lack of it; Windows keeps IT personnel too busy to investigate alternatives. Ironically, some IT departments still have a Windows-only rule, to avoid having to spend the extra time to learn to support other platforms, even though doing so might pay off in the long run, through reduced need for support.

Of course, time is just another word for money, and money is perpetually oversubscribed in any small business, the upshot being that it takes an iron will to consider total cost of ownership above initial price. Support somehow becomes a separate issue, subsumed under the necessity of having an IT department at all, and is effectively rendered a non-issue by (mainly voiceless) repetition of "that's what we pay them for."

So how to break into this circle of nonoptionality?

The first step probably has to be taken by the company (or IT department) itself, rescinding their Windows-only policy in principle. Until that happens, the best arguments in the world fall on deaf ears.

Once they've taken that step, other platforms can compete for their business on the merits, and once that's true new opportunities appear, as if by magic. Many of those opportunities will take the form of workstations that can easily be replaced by something else (a Mac mini, for instance), either because their users have no need for specialized software or because what they do need happens to be written in Java or some scripting language that will run just as happily on other platforms, or it runs on a web server and they can access it through any modern browser.

Those are the easiest sales, but there's another category that doesn't involve substitution but rather the insertion of a new layer, workgroup servers, into the company's network. I can say this with confidence, because Windows Server is so expensive only the most successful small companies will have seriously considered providing a dedicated machine running Windows Server for each workgroup. Others might be aware of the potential benefits that groupware running on workgroup servers can bring, but for most the cost of Windows Server will have proven prohibitive. Not so with other platforms. Apple's Mac mini server, for example, costs $1000, no matter how many client machines you connect to it, and it comes with the basic categories of groupware already installed.
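The arithmetic behind that point is simple enough to sketch. The per-client license figure below is a placeholder invented for the example, not an actual 2010 Windows Server price; the shape of the comparison is what matters.

```python
# Toy comparison of a flat-priced workgroup server against a hypothetical
# per-client (CAL-style) licensing model. Figures are placeholders.

def total_cost(base, per_client, clients):
    """Total software/hardware cost for a workgroup of a given size."""
    return base + per_client * clients

clients = 25
flat = total_cost(1000, 0, clients)        # flat price, any number of clients
per_seat = total_cost(1200, 40, clients)   # hypothetical base + per-seat fees
print(flat, per_seat)  # → 1000 2200
```

Under a flat price the cost per workgroup is fixed, so the gap widens with every client added, which is why per-seat licensing looks prohibitive to a small company considering one server per workgroup.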

Another selling point (for Macs) that shouldn't be overlooked is the relative ease of development of custom applications, presenting the possibility of crafting what you really need, whatever it might be, in-house, rather than making do with what you can find elsewhere. Microsoft has made the creation of simple databases and automated spreadsheets fairly easy, but beyond that you're pretty much on your own. Anyone with the skill to create an automated spreadsheet of more than trivial complexity can learn to build practically anything on the Mac, and well written Objective-C is self-documenting, so if they leave someone else should be able to pick up where they left off.

Of course, for the really tough cases, Macs run Windows just fine.

Thursday, July 08, 2010

a notion regarding Apple TV

A strange thought just crossed my mind. What if Apple, rather than combining the AppleTV with the Mac mini, were to instead combine it with the Time Capsule? The result would be an AirPort base station, plus a largish, reliable hard drive, plus video hardware and ports to allow wired connection to your TV or home theater setup. One product, three complementary uses. Make that four, as such a device would surely also run iOS apps, including games.

The hard drive could automatically be divided between maintaining a media library and preserving Time Machine backups. The AirPort functionality would work equally well to stream media to another device, or to allow you to connect controller hardware for games, or enabling your laptop or desktop to send incremental TM backups, or simply as a means of connecting to the internet.

Really, if your starting point is the AppleTV, all that's needed is a faster CPU and GPU, something adequate for games, a larger hard drive, and the software in a Time Capsule. The A4 chip is probably an adequate CPU, although a dual-core variant or something based on the next-generation ARM design might be preferable given the need to drive a 1920 x 1080 display, but these parts shouldn't be prohibitively expensive.


Sunday, June 27, 2010

a quiet suggestion regarding the rebranding of Mac OS X

As viewed from the perspective of iOS, what is Mac OS X, really?

Is it the supercharged, full-featured version? That was more the case before the introduction of iOS 4 than it is now. There is probably still an argument to be made for this view, but not as strong an argument as before. The biggest remaining difference is that Mac OS X handles a multi-window environment and full-blown multitasking. On the other hand, it doesn't yet handle touch events out of the box.

Is it the parent, with legacy issues? Parent, maybe; legacy issues, not so much. Most legacy issues in Mac OS X have already been dealt with. The biggest remaining one is probably the file system, HFS+, which dates back to later versions of the original Mac OS.

Is it the older, bigger brother? As a metaphor, this one actually works pretty well. Mac OS X has been around longer, is quite a bit more massive, and optimized for completeness as well as for performance, as measured both from the interface-response perspective of the user and with regard to battery life.

Being optimized for completeness means, among other things, that it supports a wide variety of hardware. A single installation disk can be used to install Snow Leopard on any of dozens of Mac models, with varying CPUs, GPUs, support chips, monitor resolutions, etc.

It also means that new features that aren't tied to a touch interface, or to one of the special capabilities built into the iPhone and other members of the iOS family, are likely to appear in Mac OS X first, especially if they involve a large amount of code/resources, or place a significant burden on the hardware.

So you could think of Mac OS X as being the extended version of iOS, or, given that it runs on more powerful hardware, maybe the extreme version.

To come back around to the question of rebranding, "iOS Portable" (or Mobile), "iOS Desktop", and "iOS Server" would work very nicely, except that it might begin to be confusing if you have laptop computers running "Desktop" and desktop computers or set-top boxes or tablets too large to carry around running "Portable" or "Mobile".

On the other hand, "iOSx", with a small 'x', no longer a Roman numeral, would also work, unusually quietly as such things go. Combined with separate versioning (iOS 4.x being contemporary with iOSx 10.6.x), it sufficiently distinguishes the products, while making it very clear that they are fundamentally variations on the same basic design.

Sunday, June 06, 2010

a truck is in the eye of the beholder

Update: Jason Snell addresses the angst of Mac users regarding the Mac's possible obsolescence. In a nutshell, he says: not soon.

I just want to say that, here in Boulder, trucks, mainly SUVs, are rather common.

That said, I'd like to point to Patently Apple's piece on The Next OS Revolution, and leave you to draw your own conclusions.

Thursday, May 13, 2010

Adobe's founders fume mindlessly in public

The biggest problem with the open letter released by Adobe's founders is there's no there there, no substantive argument.

They warn against the web fragmenting into closed systems, but in fact the exact opposite is happening: the web is unifying into an open system with a common language; it's reaching maturity.

One Adobe creation, the Portable Document Format, will constitute an important part of the new web reality, in no small part because it is no longer proprietary. When Adobe turned PDF over to the ISO, all objections against its use disappeared. Had they done the same thing with Flash, the result would likely have been the same, and Flash would now be joining HTML5, the DOM, CSS, and JavaScript as part of the emerging standard. In that alternate reality, even Apple would likely be happy to support it, because they would be writing their own implementation to their own high standards, and Adobe, having had a huge head start, could continue to dominate the market for tools to create Flash content.

Instead Adobe chose to keep Flash a proprietary format. They should not be surprised that others' reactions run from lukewarm acceptance to outright rejection.

That Flash can be made to run acceptably on mobile devices, and that some web developers would prefer to keep using it rather than learn to use the open standards, doesn't even remotely constitute evidence that the future of the web is at risk unless every platform vendor were to allow it on their systems. Flash is common, yes, but its percentage of installation has already peaked and is beginning to dwindle. It is not and now will never be part of the standard, even if Adobe were to finally see the light and turn it over to the ISO or W3C; it's too late for that.

Monday, May 10, 2010

Simon Sinek at TED

This speech was delivered last year at a TED conference in Washington state.

Click here for full frame.

Tuesday, May 04, 2010

using physical brushes with touchscreens

Bill Gates says pen-based tablets will beat the iPad, at least with students.

Something like a decade ago, one of my favorite rant topics was ‘where oh where is the electronic brush?’ I was referring not to brush gadgets such as are common in drawing programs, but to a physical brush that could be used in conjunction with an electronic display. At the time I imagined a brush composed of optical fiber, along with some internal electronics to detect the hotspot of a CRT as it passed beneath the bristles, and more electronics in the computer itself to correlate the signal generated by the brush with a position on the screen.
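For the curious, the CRT scheme described above can be sketched in a few lines: the brush reports the instant the scanning beam passes beneath it, and the computer converts that instant into screen coordinates from the known scan timing. The resolution and timing figures below are illustrative round numbers, not real NTSC or VGA timings.

```python
# Sketch of light-pen-style position recovery: detection time relative to
# vertical sync, plus known scan timing, yields (x, y). Numbers are toy values.

PIXELS_PER_LINE = 640  # visible pixels per scan line (illustrative)
LINE_TIME_US = 32.0    # time to draw one line, in microseconds (illustrative)

def beam_position(us_since_vsync):
    """Map a detection time (microseconds after vertical sync) to (x, y)."""
    y = int(us_since_vsync // LINE_TIME_US)            # which scan line
    frac = (us_since_vsync % LINE_TIME_US) / LINE_TIME_US
    x = int(frac * PIXELS_PER_LINE)                    # how far along it
    return x, y

# Halfway across scan line 100:
print(beam_position(100.5 * LINE_TIME_US))  # → (320, 100)
```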

Well, CRTs are hard to find these days, and LCD screens don't have hotspots. On the other hand, touchscreens have become quite common, and at least those used by Apple are good enough to use for drawing.

The trick would be to find a bristle material, or the combination of a bristle material and the internal design of the brush, that would sufficiently mimic the capacitance of a finger to be detected as such. You could then use as fine a brush as the touchscreen will reliably detect.

The difference between such a brush and a pen-input system might appear negligible, to the casual observer, but would be far more pronounced to the person actually using the device, due to the difference between the gradual contact of a brush and the sudden contact of a pen. Moreover, a brush would provide pressure information directly to the device, via the touchscreen, without need for a Bluetooth connection, and clues to nuanced movement via the rolling or rotation of the triangular area contacted by the tip, so you'd be able to use it to move sliders or rotate dial gadgets using very small motions of your fingers.
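A sketch of how pressure might be recovered from that contact patch: as the bristles splay under load, the touched area grows, so patch size can be mapped to an estimated pressure. The calibration constants below are invented, and real touch controllers may report contact geometry differently, if at all.

```python
# Toy mapping from a contact ellipse (as a capacitive screen might report it)
# to a normalized pressure estimate. Calibration constants are hypothetical.
import math

MIN_RADIUS_MM = 1.0   # lightest detectable touch
MAX_RADIUS_MM = 6.0   # bristles fully splayed

def pressure_from_patch(major_mm, minor_mm):
    """Estimate normalized pressure (0..1) from the contact ellipse axes."""
    radius = math.sqrt(major_mm * minor_mm) / 2  # geometric-mean radius
    t = (radius - MIN_RADIUS_MM) / (MAX_RADIUS_MM - MIN_RADIUS_MM)
    return max(0.0, min(1.0, t))  # clamp to the calibrated range

print(round(pressure_from_patch(8.0, 4.5), 2))  # → 0.4
```

The rolling and rotation cues mentioned above would come from tracking how the ellipse's orientation and axes change over time, which is a straightforward extension of the same idea.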

Also consider that the written language used throughout East Asia is traditionally drawn with a brush, and is still more legible when drawn with a brush than with a pen.

Besides which, there's an elegance to brushes that no pen can match, and the main reason for using a hard pen, multipart forms, simply isn't a consideration on a touchscreen.



Sunday, May 02, 2010

not quite the Holy Grail, but...

There is now some reason beyond vain hope to think that AutoCAD will be coming to Mac OS X!

Friday, April 16, 2010

the marketplace of unmitigated, hateful nonsense

Update: It appears Apple may be prepared to reconsider Mr. Fiore's application and risk opening a can of worms.

It must have been a tough call for whichever of the people Apple employs to review iPhone/iPod touch/iPad apps submitted to the App Store was so unfortunate as to draw Mark Fiore's app. As an Apple employee, there's a pretty good chance they find a lot to agree with in the editorial content of Fiore's work, and would have liked to approve it, were it not clearly in violation of the section of the iPhone Developer Program License Agreement covering objectionable material.

The point here is the precedent. If they were to allow Mr. Fiore's work into the store, on what basis could they refuse to allow Glenn Beck, or any of a hundred (thousand?) others of a similar stripe, to flaunt their drivel on the App Store?

I can't bring myself to blame Apple for not wanting to open up the App Store to such nonsense, nor for deciding that, in the interest of keeping it out, Mr. Fiore's work also had to be excluded.

(see the Dunning-Kruger effect, via Daring Fireball)

Wednesday, April 14, 2010

Dilger scores

Despite his characteristically over-the-top manner, Daniel Eran Dilger does occasionally contribute a point or two that apparently hadn't occurred to others. Such is the case with his five myths piece on the exclusion of Flash from the iPhone.

Tuesday, April 13, 2010

another perspective on the most recent flap

I haven't yet seen the license agreement for iPhone OS 4, because, in my own way, which means on my own time and in fits and starts, I'm working on something for the iPad, which won't be upgraded to iPhone OS 4 until fall. I have some hope of having something ready to ship before then, so I'm still on 3.2. Besides, my phone is an original iPhone, and won't be able to run 4.0, so even if I were working on an iPhone app I'd have to get something new to have a test device, and buying the iPad has already used up that budget for awhile.

But even when the time comes for me to switch to the iPhone OS 4.0 SDK, it won't matter one whit that Apple has precluded the use of middleware platforms, since I've quite enough on my hands learning what I need to know to make use of their own frameworks and have no intention of further complicating the task by bringing in someone else's.

Sure, I can see where someone who's been using Flash right along, and had been hoping to avoid the Cocoa Touch learning curve, might feel like they've been slapped down, but Apple's position on Flash has been plain as day for three years, or was until Adobe began muddying the waters by promising Flash developers that they would be able to compile to native iPhone OS apps using CS5. And it should be pointed out that Apple has already allowed some apps built with beta versions of CS5 into the App Store, for use on devices running pre-4.0 versions of iPhone OS, which right now is every iPhone OS device that isn't provisioned for iPhone OS 4.0 development, which includes all iPads. It's also possible that enterprises doing their own app distribution (outside of the App Store) will still be able to make use of this feature of CS5 even with 4.0, at their own risk of course. Apple has drawn a line in the sand, beginning with iPhone OS 4.0 apps submitted to the App Store.

The wiser pundits seem to agree that it's not so much about Flash as about allowing anything to come between developers and the platform Apple has so carefully constructed, and which they are using to push the state of the art. Any third-party middleware layer that became popular could (likely would) impose constraints on further development of the underlying native OS, if only by being slow to move off of deprecated APIs and to adopt new features.

It seems to me that there are two workable approaches to this. Either you take the position that Apple has now taken for iPhone OS, beginning with version 4.0, or else you allow any such middleware platform, relying on competition among them to force them to be good citizens and using the native OS to set a high standard, which is the position Apple has taken for Mac OS X. Take Java, for example. Java has never been allowed on the iPhone, but on Mac OS X Apple writes their own Java runtime. If they had access to the specifications that would allow them to do this for Flash as well, they probably would.

This is probably the direction that Android is headed, but without the unitary native platform, since each manufacturer has their own approach, and Google hasn't yet shown much talent for herding cats. Another three years hence, Apple will have an even stronger, unified platform, but where will Android be? Lost under a pile of middleware platforms?

Sunday, April 11, 2010

recent discovery: the Robots Podcast

Given that I at least imagine myself to have a reputation for being a robotics enthusiast, you might expect me to be right on top of the best current sources of open information in the field, but you'd be wrong.

My enthusiasm is primarily for service robots, machines that do tasks people find uncomfortable, boring, demeaning, dirty, dangerous, or insufficiently valuable, such that you cannot find people willing to do them for what you can afford to pay in any but the most starkly depressed economies.

This is a category that hasn't received much attention in recent years, with most press/blog coverage going to robots that physically mimic humans to varying degrees, and most hobbyist activity directed towards battlebots.

So perhaps it's understandable that something as excellent as the Robots Podcast could escape my awareness for nearly two years. In that time this biweekly podcast has accumulated a very impressive collection of interviews with some of the most brilliant people working in robotics and closely related fields.

It's available both via RSS and on iTunes. Do check it out!

Sunday, April 04, 2010

a western understanding of chi/qi/ki

Physical science, as it developed in the west, revolves around what can be compounded out of a few basic units, primarily mass, time, and distance. Velocity, for example, is simply distance divided by time (meters/second), and momentum is velocity multiplied by mass (kilograms*meters/second).

This approach became far more powerful with the advent of calculus, which addressed rates of change (differentiation) and accumulation (integration). Differentiation and integration are complementary concepts.

It was through his invention of calculus that Newton was able to arrive at a theory of gravity (that the attraction between two objects varies proportionally to the product of their masses and the inverse of the square of the distance between them, 1/r²), and combine that with momentum to determine that the planets trace out ellipses as they orbit the sun.

If you have a mathematical expression describing the position (in a single dimension) of an object over time, then velocity is the change in that position over time, and acceleration is the change in velocity over time. Another way of saying this is that velocity is the first derivative of position, and acceleration is the second. Actually computing the first derivative of the original expression will give you a new one which describes the instantaneous velocity of the object at any moment, and computing the first derivative of that will produce one that describes the instantaneous rate of change in that velocity at any moment, or the acceleration.

The first derivative of acceleration, which is to say the expression describing the instantaneous rate of change in acceleration, is variously referred to as jerk, jolt, surge, and lurch.
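This chain of derivatives is easy to make concrete. As a minimal sketch in Python (the position function and its coefficients are hypothetical, chosen purely for illustration), a polynomial can be differentiated exactly by manipulating its coefficients:

```python
def differentiate(coeffs):
    """Differentiate a polynomial given as coefficients [c0, c1, c2, ...],
    meaning c0 + c1*t + c2*t**2 + ...; returns the derivative's coefficients."""
    return [i * c for i, c in enumerate(coeffs)][1:]

def integrate(coeffs):
    """Antiderivative, with the (lost) constant of integration set to zero."""
    return [0] + [c / (i + 1) for i, c in enumerate(coeffs)]

position = [0, 5, 2, 1]                 # x(t) = 5t + 2t^2 + t^3 (hypothetical)
velocity = differentiate(position)      # [5, 4, 3]: v(t) = 5 + 4t + 3t^2
acceleration = differentiate(velocity)  # [4, 6]:    a(t) = 4 + 6t
jerk = differentiate(acceleration)      # [6]:       constant jerk of 6

# Integration undoes differentiation, up to the lost constant term:
assert integrate(velocity) == [0, 5.0, 2.0, 1.0]
```

(The coefficient trick is just the power rule applied term by term; nothing here depends on the particular polynomial chosen.)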

At this point I'm going to switch from physics to biology, to suggest that jerk is closely related to the effect produced by the firing of neurons to activate skeletal muscles. A single impulse produces a spasm, but little actual movement, whereas an escalating stream of impulses produces a progressive tightening of the muscle, accelerating whatever movement the muscle generates.

But the bodies of higher organisms can't operate through the activation of a single muscle. They make use of cyclical patterns involving many muscles, each of which will alternately contract and relax as it plays its part in the pattern. These patterns are imprinted within and coordinated by the cerebellum, which presents a simpler interface to the rest of the brain.

When you want to raise your arm, you just raise your arm, without having to think about which muscles are pulling on their tendon connections to your skeleton to accomplish this. When you want to run, you may think "left-right-left-right" and/or "faster, faster", but again you're not having to consciously juggle the hundreds or thousands of impulses per second that initiate and sustain the pattern; that's all being done automatically for you by your cerebellum.

The first time you do something new, you're likely to do it very slowly and deliberately, because your cerebellum can only estimate what pattern will work for the new action and your conscious mind is more directly involved to make sure that it stays on track. As the cerebellum gains experience, the conscious mind can safely leave the details to it and simply choose to perform or not perform the action, as well as when, in what direction, and how vigorously.

If you collect a sufficiently complete repertoire of actions that your cerebellum knows how to perform, you may find that you are able to string them together in novel combinations, and even to create new actions on the fly to tie those sequences together. This is partly a matter of the higher brain coming to trust the ‘black box’ of the cerebellum, and to understand how to guide it.

At this point, within the range of the repertoire, the question ceases to be what can you do and becomes what will you do? How will you use it? And your response to that question is your intention.

To get back to the point, I see chi as being the degree of alignment between that intention and the pattern of neural activations the cerebellum produces. Someone whose ‘chi is very strong’ has a high degree of alignment, which is to say that what they intend and what they do are one and the same.

While the concept ‘chi’ is tightly interwoven with physical movement, similar degrees of alignment pertain to the use of speech and the cultivation of emotion. These might be termed truthfulness and respect.

Not the conclusion you were expecting?

Saturday, April 03, 2010

the Day of the iPad

No matter what fate awaits Apple's new product line, you've got to admit that it dominated the moment this morning.

Yes, I was in line too and bought one. Actually, I needn't have waited in line, as I had one reserved and could have picked it up anytime before 3:00 PM, but I wanted to be there, to participate in the launch.

Is it nice? Yes, very nice.

Am I already an expert user? Not hardly. That will take some time, even though I've been using an iPhone almost since the day they first went on sale.

Will it replace either my iPhone or my Mac? No. I can imagine that some future version might replace both, but not this model.

Will I use it in preference to those other devices for some purposes? Most likely, even when I have all three with me, although exactly which purposes remains to be seen. Watching video on a bus strikes me as a slam dunk; the iPad wins that one.

Is there something essentially right about the iPad, which no other device has previously manifested? Potentially. The idea of putting a color touchscreen on a device roughly half the size of a 13" MacBook's screen, split vertically, is brilliant, and the physical design of the iPad is superb, but so far the OS is still a version of iPhone OS, well evolved for the iPhone but not yet ready to take full advantage of the iPad. That's sure to change; iPhone OS 4.0 will undoubtedly advance that process, and what Apple hasn't foreseen will quickly be supplied by developers. It won't be long before we begin to really understand what's so special about such a device.

What I expect we'll discover is that the iPad is better suited than any device before it to bridge the gap between the user and the personal constellation, within the universe of information and connection, which they are drawn to explore.

Thursday, April 01, 2010

25 years and still going

The WELL's home page is a little more entertaining today than usual. I suppose such things are bound to happen given that its birthday falls on April Fools Day.

Wednesday, March 17, 2010

add TCO to your TLA vocabulary

TCO = Total Cost of Ownership, and refers to the purchase price plus the cost of support over the lifetime of a device, whether it be a computer or a fleet vehicle.
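Since the definition is just arithmetic, a minimal sketch in Python may make it concrete; every number here is hypothetical, invented purely for illustration and not drawn from any survey:

```python
def tco(purchase_price, annual_support_cost, years):
    """Total cost of ownership: purchase price plus support over the lifetime."""
    return purchase_price + annual_support_cost * years

# Entirely hypothetical figures, just to show the arithmetic:
mac_tco = tco(1500, 150, 4)  # pricier up front, cheaper to support
pc_tco = tco(900, 350, 4)    # cheaper up front, costlier to support
```

On these made-up figures, the machine that's cheaper up front ends up costing more over four years, which is exactly the kind of reversal lower support costs can produce.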

In a survey of corporate IT managers, the Enterprise Desktop Alliance found that respondents said Macs were cheaper to manage far more frequently than the reverse (that PCs are cheaper to manage). The survey was divided into six categories; in each, the first figure below is the percentage claiming Macs are cheaper and the second the percentage claiming PCs are:

  • software licensing fees: 31% vs. 23%

  • cost of supporting infrastructure: 37% vs. 25%

  • system configuration: 50% vs. 25%

  • user training: 48% vs. 16%

  • help desk calls: 54% vs. 16%

  • time spent troubleshooting: 65% vs. 16%

With such a strong indication that Macs really are cheaper to support, it's not so hard to imagine that they might actually have a lower total cost of ownership than PCs.

If you factor in productivity, those "cheap" PCs may be costing you even more.

37Signals takes a well-aimed shot at Karl Rove

"Courage and Consequence" huh? That must be the courage to do the wrong thing, and consequences and how to avoid them.

Sunday, March 14, 2010

ReadWriteWeb discovers Boulder

It's true what they say: for a relatively laid-back town, the population of which swells and shrinks noticeably in step with the University of Colorado's academic calendar, Boulder's got a lot going on.

(Posted from ‘The Goat’ mentioned here.)

intellectual rigor vs. the null hypothesis

In sifting through old bookmarks, I happened on a link to Real Climate, which is currently headlined by this testimonial.

Some who are loath to accept the notion that mankind's activities are altering the global climate point to lower temperatures over the last few years as proof that global warming is a bunch of hooey. What this specious ‘analysis’ completely neglects to take into account is that the last few years have been an ebb period in the solar cycle. In fact, 2008 and 2009 were two of the three lowest years in the last century for the amount of energy received by Earth from the sun, and we should expect that future such ebbs in the solar cycle won't, on average, be as deep as the one we're now emerging from.

And, before you join others in jumping to the conclusion that the sun is cooling off, remember that a century is a very short period in the life cycle of such a star, and that it has been both cooler and warmer in the past.

Friday, March 12, 2010

anticipation reloaded

And so it continues, or begins again. Starting today you can actually commit to the purchase of an Apple iPad, for delivery or pickup on Saturday, April 3rd.

Shall you get the base model, or go for more memory, or one of the 3G models? Should you get AppleCare (2 years for $99)? How about a case? Oh, and you'll be needing a charger.

There are some apps you're sure to want too, but you'll have to wait until you actually have your new iPad in hand to get those. Patience, patience. ;-)

Wednesday, March 10, 2010

can't touch that

In what may be the most insightful view of Apple ever, Gary Hamel's two-part post in his WSJ blog is certainly the best piece I've seen on what is so special about Apple as it is today. My hat's off to him.

Thursday, March 04, 2010

our faltering, disjointed patent system, and how to begin fixing it

John Gruber, of Daring Fireball, waxes eloquent at length about the Apple-HTC lawsuit, quoting Tim Bray extensively and interspersing those passages with his own comments.

My take on the situation is that Apple had to sue somebody, or risk seeing the legitimacy of their patents evaporate for failure to defend them, but even that is evidence of a broken patent system which substitutes litigation for respect.

As previously stated, I believe the protection of the rights of the inventor to be only one of two important principles behind the patent system, the other being the maximization of the rate of accumulation of knowledge and technique in the public domain, for free use by all. Clearly, we have recently erred in the direction of protecting the inventor, providing protection even for creations that should not have been patentable in the first place.

One way to quickly roll this situation back, without having to first wade through the messy detail, would be to shorten the term of all patents, both going forward and retroactively.

I propose ten years from the date of priority be made the expiration date for all protections relating to the ownership of an invention, and that only the right of creative attribution should persist beyond that term, with limits on awards for successful false claim suits.

Combining this with higher standards for the issuing of patents going forward would at least ensure that the situation is simplified and dramatically improved over the next ten years.

If this proposal gains traction, you can expect the pharmaceutical industry to cry foul, and I suppose for them you might choose to start the ten-year period at some later point, perhaps the date when a new drug is deemed safe for use and no longer experimental.

Wednesday, March 03, 2010

what Joint Venture (TM pending) might and probably won't be

If you, like myself, caught the news of the "Joint Venture" (TM pending) trademark application at a time when you didn't have time to go looking for details, your imagination, like mine, might have run away with you, suggesting all sorts of possibilities. Let's dispense with those first.

Here's some things "Joint Venture" (TM pending) probably won't be about:

  • an intellectual property consortium

  • a research & development consortium

  • a venture capital fund for small businesses using Apple technologies

  • a partnership involving a brand new direction for Apple

What it seems to be about is branding a consolidation and expansion of marketing to and services for small businesses and corporations, possibly in conjunction with and including support for value-added sales/service consultants. If this much turns out to be accurate, then you can also expect Apple to set standards it expects its non-employee representatives to live up to, and probably also standard contracts, with an array of options. I'd expect those contracts to include a clause that allows Apple to directly take over (or reassign) the relationship with the customer if they aren't happy with the quality of service provided by their Joint Venture (TM pending) partners.

What added value might those sales/service consultants provide? Installation and on-site training and service is practically a gimme. Beyond that, one significant possibility is custom programming - building applications, scripts, or simply Automator workflows that are tailored specifically to the customer's needs - and, to this end, Apple might provide some additional building blocks beyond what's already available.

There might also be resources available to encourage smaller business software companies to port their niche-dominating apps to the Mac.

(If this is to scale up very far, they're going to need deprogramming camps to soften up the preconceptions of people who've been exclusively exposed to and conditioned by Windows and Windows applications, to get them to see what they've been identifying by keystrokes and mouse clicks in more general terms and to weaken their notions of what the limits are.)

A bit too abstruse for prime time, perhaps, but potentially a massive project with far-reaching implications.

Saturday, February 27, 2010

D. E. Dilger and friend spoof Mac/PC ads

If you've seen any video of Dilger doing standup comedy, you'll know that he's not always as dry as most of his "Ten Myths of Apple's iPad" videos have been. However, in the 10th installment, he enlists the help of a friend to spice things up a bit, with a result that's quite different. It was intended to be a spoof on the Mac/PC ads, with John Hodgman's character replaced by a slinky iPad running iPhone OS, so that it's Mac OS vs. iPhone OS...

That said, it's slightly ironic that Dilger's videos were among the things I couldn't see on the web while I didn't have Flash Player installed, considering some of the things he's said about Flash on the iPhone and iPad. Come on, D.E., let's see an alternative to Flash on your website!

Friday, February 26, 2010

Apple as an electromechanical device company, WTF?

Granted, Apple is not obviously more of an electromechanical device company than other computer makers. But consider not only that hard disk drives and DVD drives are very much electromechanical devices, but also that a broad definition of electromechanical would include ports, which have both electrical and mechanical force requirements; the cables that connect laptop screens, which must pass through hinges; and the shell and chassis, which must possess sufficient rigidity to prevent damage to the screen and circuit board. Then there's the battery connector, which must maintain an absolutely constant electrical connection despite shock, vibration, and corrosion.

So, okay, we're mostly not talking about solenoids and stepper motors, and where we are they're included in major components that come whole from some supplier, but take a closer look.

Apple's designers are no strangers to mechanics. Remember the iMac that hinged like a desk lamp? Did you ever look closely at the hinged arm that connected the base unit of one of those machines to its display? And what about the unibody construction of the aluminum MacBook Air and Pro? Apple didn't just ship the specs off to someone else; they designed the machining process. They also designed their own battery construction process.

Apple is famously more attentive to the physical design of its products than its competitors are, just as it is more attentive to the electronic components and software, but maybe even more so. Physical design is almost an obsession at Apple, right down to the fit and finish. Perhaps it's a stretch to refer to a Dell laptop as a mechanical device, but to deny that an Apple laptop is one is to fail to appreciate the many hours of sleep lost over issues such as ensuring that the magnetic clasp presented just the right amount of resistance to opening, or that the screen would remain in whatever position it was set.

All Apple products, but particularly the laptops, are designed this way, with meticulous attention to the physical characteristics of every component. They are machines in every sense of the word.

Apple, $40 Billion in cash, and extreme patience

As I've already stated, I think Apple will have to get into robotics sooner or later, because that's what I expect consumers to vote for with their dollars, euros, yen, and yuan, once the machines become really useful (a matter of another year or two) and once people come to understand how useful they can be. But that's a conclusion without more than a hint about how I arrived at it.

Apple has a lot of money, and some are saying they should be buying back stock or giving shareholders a dividend. Steve Jobs doesn't think either of those options would significantly affect the price of the stock, and prefers to hold onto the cash until an opportunity to make better use of it presents itself. He has also suggested that something bigger than previous acquisitions might be on the table.

What could Apple buy that would improve their longterm profitability?

Factories? Maybe, but any such facility would need to be staffed, frequently retooled, and run more economically than those operated by the Chinese manufacturing companies Apple currently contracts with.

A foundry? Foundries require fewer people, but the frequent retooling (to keep up with process technologies) would be a huge expense, and why go to the trouble if contract foundries are producing the quality you need at a reasonable price and respecting your need for secrecy?

Any kind of presence in emerging markets? Well, yeah, but the iPhone is already doing a pretty good job of opening those doors, and getting into the thick of local competition could prove counterproductive. Sprinkling China liberally with Apple Stores is a good approach, and one that will probably also work well in India.

Less-than-successful technology companies, for their engineering talent? Apple hires the best people it can find, with skills that are well aligned with the needs of the company. Such people are likely to be as rare as hen's teeth in tech companies of no particular relevance, available at bargain-basement prices. Still, there might be the rare instance when a fortuitous patent or two could sweeten the deal sufficiently.

So, okay, let's get over the idea that Apple is about to go on a random shopping spree, and try to anticipate what sort of deal would look worthwhile to Steve Jobs and Apple's executives and Board of Directors.

It would have to be something that built on what Apple already is, which is a pretty complicated subject in itself. Apple is many things, mostly relating to digital electronics and electromechanical devices (see following post), their programming, their packaging as products, their marketing, and aftermarket sales including music and movies. There are a lot of ways to hook into this framework, and therefore a lot of different types of companies that might make useful additions. An important question is in which direction Apple wants to grow faster than it could via individual hires and in-house R&D.

For a smaller company, the employees of which would be absorbed into Apple's existing structure, the culture doesn't matter so much, but if the company to be acquired is large enough that some substantial part of it would essentially be appended to Apple, with employees continuing to work with and report to the same people as before, it would be more important that the company's culture be compatible with Apple's. This is perhaps less of a problem if the company to be acquired is physically remote from Apple's main campus, allowing some breathing space for both.

Application of these filters may reduce the field tremendously, but they still leave many potential acquisitions to pick from, far too many to allow an outsider to predict what Apple will do.

We will just have to wait and see.

Thursday, February 25, 2010

Apple as a mobile/portable device company

In a presentation/interview in the context of the Goldman Sachs Technology & Internet Conference 2010, Apple COO Tim Cook was unusually forthcoming about the company's nature and direction for the future and where their various product lines fit into it.

As part of this, he again characterized Apple as a mobile/portable device company. Longtime Mac users may cringe at this, worrying that the Mac is in danger of becoming an orphan. Here are a few points to set your mind at ease.

First, the Mac is doing extremely well in the market, with growing unit shipments and revenue, and dramatically growing market share, particularly on the desktop and above the $1000 price point. This is a business no one in their right mind would walk away from.

Second, the Mac is rapidly gaining acceptance in corporate environments, suggesting the potential for a reliable, longterm market, even though Apple's attention to this market sector has mainly been limited to resolving technological issues, like Microsoft Exchange support.

But most importantly, Mac OS X and iPhone OS, while not identical twins, are at least full siblings, sharing so much code that, user interface aside, it can be easy to forget there's any real distinction between them at all. They are, in fact, two manifestations of what is fundamentally the same system. As Mr. Cook said at the Goldman Sachs conference, this is a huge advantage for Apple. It means they get more mileage for the development dollar, since new technologies can be applied to more than a single platform with little modification, and it also helps to impose the discipline that ensures OS X remains well ordered in its design, minimizing unexpected results from code changes (bugs). Apple is likely to expand the reach of iPhone/OS/X by using it in all of their product lines, helping to ensure interoperability and further leveraging their development efforts.

So, yeah, Apple is a mobile/portable device company, and a desktop company, and a content delivery company. Don't let their decision to emphasize the mobile/portable aspect make you fret that the Mac is endangered. It isn't.