Saturday, December 31, 2011

progress, and the factors which constrain it

While the timing of the transition from one year to the next is arbitrary, an artifact of choices made in calendar design, the passing of time is at least a persistent illusion, and the metamorphosis that accompanies it convincing. "The times, they are a-changin'," as a younger version of Bob Dylan once sang. Change has been a recurring factor in my life and in the lives of my contemporaries, like a stressed and agitated Earth repeatedly shifting beneath our feet, and much of it at least seemingly not for the better.

I say "at least seemingly" because for any change there will be cascading effects which are difficult to predict at the time, and these cascading effects often interact in surprising ways, so even if you find it hard to believe in "progress" you can still place hope in serendipity.

My basic education in biology informs the way I see progress, by way of a simple rule of thumb about plant nutrition I once learned (Liebig's law of the minimum): the growth of a plant is constrained by whichever nutrient is least available, relative to the proportions in which all nutrients are needed. Similarly, progress depends on all of the necessary conditions being in place, or at least acquirable, not just one or two of them, and resources spent bringing the limiting factor(s) up to snuff yield the most bang for the buck.
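
That rule of thumb is easy to make concrete. Here is a toy sketch (my own illustration, with made-up numbers, not drawn from any agronomy source) of how the scarcest resource, relative to requirements, caps overall growth:

```python
# Liebig's law of the minimum, as a toy calculation:
# growth is capped by the smallest available/required ratio.

def limiting_factor(available, required):
    """Return the name and ratio of the most limiting resource.

    available, required: dicts mapping resource name -> amount.
    """
    ratios = {name: available[name] / required[name] for name in required}
    name = min(ratios, key=ratios.get)
    return name, ratios[name]

# Hypothetical figures: nitrogen and potassium are plentiful, phosphorus scarce.
available = {"N": 120.0, "P": 10.0, "K": 80.0}
required = {"N": 100.0, "P": 20.0, "K": 50.0}

factor, ratio = limiting_factor(available, required)
# Phosphorus caps growth at half of potential, no matter how much N or K is added.
```

The same arithmetic applies to progress generally: adding more of an abundant input changes nothing until the scarce one is addressed.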

So what is/are the limiting factor(s) hindering progress? I can think of a few.

One has been the cost of computation, but it could scarcely be called a limiting factor anymore, even though many applications remain for which the necessary processing capacity continues to be prohibitively expensive, and/or too power hungry. Much that hasn't yet been done could be done within the limits of current technology.

Another is the knowledge and experience to make good use of that computational power. This too is changing, but it's trailing behind the improvement in computing hardware. I'm referring here not only to software but to techniques for interfacing with the physical world, the sensors and actuators of robotics, and the integration of all these into working systems.

Less obviously, but perhaps more importantly, progress has been constrained by what we have (habitually) used these improving technologies to do. To riff on the old saying about when you have a hammer everything looks like a nail, we have, until quite recently, treated every new thing to come along as another kind of hammer, and measured its value in terms of how good it was at driving nails. In other words, we haven't been much interested in changing what we do, only the details of how we do it. I believe people are generally ready to climb out of this rut, if they could rely upon mutual support in doing so.

Progress has also been limited by how we organize ourselves, primarily driven by the conservatively defined interests of capital. There's been a great deal of experimentation with alternative ways of bringing people together to do creative/productive work collaboratively, much of it supported by venture capitalists, but there's still a lot of inertia in the old way of doing things and not yet enough successful counter-examples to point to, or enough general experience with participating in them.

And finally, there is a tremendous need for remedial education in science, technology, engineering, and mathematics, most of which will have to be conducted remotely, via self-instructional packages, video courses, or mass media. I believe this deficit to be the twin product of the counterculture's rebellion against all things technical and a resurgence of the thread of anti-intellectualism that runs through western culture. That thread can perhaps be traced to Celtic pride in the lack of a written language, but in any case it has been encouraged by those who find a well-informed, clear-thinking populace inconvenient.

Most of us are in a position to work on one or another of these, even if for now it's only to educate ourselves. Let's get to it!

Saturday, December 10, 2011

reforming agriculture through more sophisticated mechanization

Historically, at least since the mechanization of agriculture began in earnest, there have been two primary measures of agricultural productivity: the amount that can be grown on a given acreage, and the labor required to feed all of us. The former, measured in bushels or tons per acre, has generally been increasing, and the latter, measured in man-hours per bushel or ton, decreasing, for at least the last hundred years, albeit more so for some crops than for others. (A consequence of the decreasing need for labor to produce many staples has been the migration of the children of farmers to cities, where they helped keep the cost of labor low in other enterprises.)

Corn (maize) is a good example of a crop for which these conventional measures of productivity tell a story of brilliant progress. Corn is now cheap enough to use not only as livestock feed, to be converted into meat and dairy products, but also as the feedstock for producing fuel ethanol, competing with fuels refined from petroleum pumped from the ground. That is rather remarkable, considering that corn kernels represent only a small fraction of the biomass of a corn plant, and that fermentation and distillation aren't particularly efficient processes.

Crops that fare less well by these measures include many vegetables and most fruits, which have been growing gradually more expensive, especially as compared with grains that are easily handled mechanically, but even as compared with meat and dairy products from grain-fed livestock. One major consequence is that people generally consume more grains, meat, and dairy products, and fewer fruits and vegetables, than they did before the mechanization juggernaut got started, while vegetable gardens were still common.

So, by an altogether different measure, the healthfulness of the average diet, mechanization has been a disaster, so far. I say "so far" because the essential problem is that, so far, mechanization has favored crops consisting of hard, dry seeds that are easily handled in bulk, making the other crops needed for a balanced diet relatively less affordable. In happier economic times this would matter less, as people would simply pay the premium for a healthier diet; but the times being what they are, people are scrimping however they can, including on the food they consume.

There are other ways of measuring productivity: energy use*, soil gain or loss*, water use and contamination*, and the degree to which a given practice denies space to native flora and habitat to native fauna. By any of these measures, conventional mechanization comes out looking at least shortsighted if not dimwitted.

*(per unit produced)

So is the answer to turn back the clock on agricultural technology, to replace the plow with the hoe and the drill with the planting stick? I'm not prepared to make that argument - although I've no doubt others would - aside from noting that gardens are a better use of many urban spaces than are lawns, and there is no further need for rural communities to supply cities with cheap labor, since those cities are already well supplied, and many rural areas suffer from depopulation.

Instead, my position is that we need to take mechanization to the next level, replacing dumb machines suited only to bulk operations with smart machines capable of performing well-informed, detailed manipulations, for example controlling weeds by selectively pulling them from the ground or pest caterpillars by picking them from plants (unless they've already been parasitized, as by wasps) rather than by applying poisons.

Given machinery with an adequate array of sensors and a sufficiently broad range of optional actions, applying best practices becomes a matter of mating these with processing power connected to an expert system, and of programming.
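
To make the idea of mating sensors and actions to an expert system a little more concrete, here is a minimal sketch of the decide step, in the spirit of the weed-pulling and caterpillar-picking examples above. Every name in it is hypothetical; real agricultural robotics software would be vastly more involved:

```python
# A toy decision rule mapping one field observation to one action,
# following the practices described in the text: mechanical weeding,
# hand-picking pests, and sparing parasitized or unidentified organisms.

from dataclasses import dataclass

@dataclass
class Observation:
    species: str       # classifier's best guess, or "unknown"
    parasitized: bool  # e.g. a caterpillar already carrying wasp eggs

def decide(obs: Observation, protected: set) -> str:
    """Return the action for a single observation."""
    if obs.species == "unknown" or obs.species in protected:
        return "leave alone"       # tolerate natives and unidentified finds
    if obs.species == "weed":
        return "pull from ground"  # mechanical control, no herbicide
    if obs.species == "pest caterpillar":
        # parasitized caterpillars are left to host beneficial wasps
        return "leave alone" if obs.parasitized else "pick from plant"
    return "leave alone"

action = decide(Observation("pest caterpillar", parasitized=False), {"native aster"})
```

The expert system proper would supply and refine rules like these; the point of the sketch is only that, once sensing and actuation exist, best practices reduce to explicit, auditable decisions.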

It gets better, because the same system that works the land can be used to improve the expert system through experimentation and, in routine operation, by accumulation of data to which statistical methods can be applied, and can also be used to improve the crops themselves, as for instance by leaving the best formed, most insect resistant cabbages to go to seed.

The bottom line is that this approach can make available the mechanical equivalent of an attentive expert gardener, at a cost, given predictable economies of scale, that would make possible the wholesale replacement of conventional, traction-based machinery and methods with more adaptable machinery bringing a whole new repertoire of methods to bear, one far better suited to the production of the fruits and vegetables that have been becoming unaffordable under the current regime.

As for the other measures of productivity mentioned above, such machinery, since it wouldn't need to turn soil in bulk and could operate long hours without continuous supervision, would consume energy at a relatively low rate, suitable for supply from solar panels or via the grid from renewable sources. It could operate through continuous ground cover, all but eliminating soil loss, and with minimal use or complete non-use of herbicides and pesticides, reducing soil and water contamination. Ground cover, mulch, and the humus accumulating from decaying roots can also reduce the need for irrigation, and the ability to create local varieties through seed selection based on the health of maturing plants can further reduce it, as well as helping to adapt more quickly to climate change. Making room for native species, something that can only be accomplished in conventional practice by leaving land completely undisturbed, becomes a matter of programming the system to leave certain species alone, wherever it finds them, even to the extent of tolerating some crop loss to native fauna, and to leave anything it can't identify alone until it can be identified.

Such machinery might not be able to compete with conventional practice in the production of corn and other bulk commodities, at least to start with, but it also wouldn't consume prodigious amounts of petroleum-based fuels. Moreover, development and rapid deployment of such machinery would drive the growth of a new, potentially domestic industry, one that would also work to the benefit of materials recycling efforts, more efficient transportation, and on and on.

The R-word I haven't yet mentioned is robotics. While such machines probably aren't what most people first think of when robots are mentioned, their creation and production fall squarely within the discipline of robotics, composed as they would necessarily be of robotic technologies.

Friday, December 09, 2011

monitoring fields with UAVs

They're using radio-controlled aircraft rather than autonomous machines, but it's still a big improvement over the time spent walking fields or the lack of detail that comes from only checking the edges of a field. I expect further improvements with the introduction of better sensors and on-board controllers.

Friday, November 25, 2011

Microsplat: How Microsoft's business could collapse

Business Insider has published an article, STEVE BALLMER'S NIGHTMARE: How Microsoft's Business Actually Could Collapse, outlining several mechanisms which could coincide to cause Microsoft's revenue stream to contract dramatically, likely Microsoft responses to these pressures, and alternative takes on the outcome.

Saturday, October 29, 2011

alternatives to the current economic system, and constraints thereon

Responding to a question posed on LinkedIn, "What's a better alternative to the current global economic system?"

I'm tempted to say no alternative is possible, by which I mean that only incremental change can happen. The system we have is both enormously complex and intolerant of wholesale meddling. In the unlikely event that some fundamentally different system could be agreed upon, with a switchover date, you'd have people dealing in futures based on how long it would be before the alternative collapsed and we were back to business as usual.

Even incremental change directed away from the essential nature of the system as it currently exists is quite difficult. The system serves the interests of those able to apply leverage, both economically and in the sphere of public opinion, and fighting this is roughly equivalent to swimming upstream. Nevertheless, there are some things that might be done.

The existence of a malnourished, hopeless underclass is in no one's interest. It saps the spirit of a society and creates an element of instability that occasionally erupts as mob violence. This problem could be eliminated overnight through a guaranteed minimum income, or the equivalent in subsidies for food, housing, clothing, health care, and connectivity, with bonuses for self-improvement, and only a fraction of a dollar taken away for each dollar earned. The cost would be relatively small compared with other ways we spend our money, and also small compared with the consequences of losing so many consumers, whose purchases help drive demand and therefore the profitability of business. In any case, measures of equal scope will become necessary as automation further reduces the percentage of the population that need work to maintain a given standard of living for the society as a whole. Raise the standard of living, and that percentage comes back up, but with constraints; some may need to retrain for two or three years for every year their skills are marketable.

In a world where corporations and individual fortunes transcend national boundaries, but taxation doesn't (except as nations themselves are expected to contribute to international funds), there are many ways to escape paying taxes, and the responsibility to do so has fallen out of fashion. While at this moment it might seem politically unachievable, vesting the power of taxation in some world-wide entity that also transcends national boundaries would help level the playing field, and, for example, diminish the pressure on local authorities to provide incentives that undermine the value of new enterprises located in their districts, and to overlook abusive practices.

Corporate personhood is also due for reexamination. While some of the consequences of this legal fiction make sense to me, corporations having the rights and responsibilities of the ownership of land, buildings, and machines, for instance, others do not. Intellectual property is a gray area for me. On the one hand it makes sense that a corporation ought to derive preferential benefit from research it conducted in-house or funded, while on the other hand it makes sense that the overall benefit would be greater if that research had been conducted according to academic norms of openness, at public expense. I don't believe corporations should be allowed to intervene in any way in the political process, neither directly by officially supporting or opposing parties, candidates, or ballot issues, nor indirectly through PACs, nor by compensating employees or officers who do so on their own time and/or out of their personal funds. On the other hand, I don't believe in the taxation of corporate income. Real estate and other property, yes, even liquid assets, but not income. Taxation on income should be deferred until it becomes the income of some real person, whether through payroll, stock options, or dividends. Regarding taxation of funds earned abroad and repatriated, presumably they've already been taxed by the countries in which they were earned, so it makes sense that they should be taxed here at a reduced rate, if at all.

To some these will sound like radical suggestions. To others they will seem far too tame. Such is the way forward.

Sunday, October 09, 2011

Saturday, October 08, 2011

replacing Steve Jobs's sense for what people will want

Except for the small percentage of people with fluid imaginations, many of whom are borderline schizophrenic, people can't know what they want until they've seen it, or at least heard it described, or better yet tried it out for themselves. The ability to predict what people would want, and be willing to pay for, was no small part of Steve Jobs's genius, and, in the absence of another individual with that same gift, Apple will need a process that can produce results at least nearly as accurate as Steve's intuition did.

I think Apple has all of the elements from which to build such a process already, and only needs to connect them together. Their engineering and design operations already work closely together, each contributing new ideas. To this they only need to add retail; that's right, the stores, hundreds of them, with personnel in constant contact with Apple's customers.

They can't, of course, send product designs out for retail employees to show to customers. Not only would that approach completely negate the secrecy aspect of the company's culture (largely responsible for its mystique), but the feedback it yielded would be almost worthless.

Instead, they simply need to listen when customers describe features they'd like to see implemented or products they'd like to see built, and pass along what they hear to a group back at Apple HQ, created for that purpose and closely connected to both engineering and design. That group would sift through the suggestions, recombine them, and pass along the most promising of them to product development managers, who might either initiate official projects or authorize skunkworks projects, depending on how close the idea came to describing a marketable product, meaning one that could be built economically enough, using available technology, to sell briskly at a customary markup.

Even better would be a structure wherein both design and engineering had representatives in the stores: design representatives on the floor, and engineering representatives behind the genius bar. These would probably be retail personnel with special training, called over to listen to customers' ideas while other personnel went on with the ordinary business of the store. These special representatives could also constitute the pool from which the sifting group back at HQ was drawn, providing not only an advancement path for retail (other than management) but a section of the company which, properly led, would gradually become expert at identifying and describing potential products.

It's probably not necessary to have a pair of such representatives at every store; perhaps 10% of them would suffice. If you think in terms of the adage about saying "no" to 1000 things to find the one thing worth doing, and expect that each such representative will glean an average of one reasonable idea per week, then 100 such representatives should produce one really good idea, worth pursuing, every 10 weeks or so, or about 5 per year. That might seem like a lot of wheel-spinning for a few good ideas, but good ideas are what keep a company like Apple healthy, and just one blockbuster product would pay for many years of this approach. Moreover, in the meantime, there'd be thousands of customers who left the store feeling as though someone had really listened to them.
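
The arithmetic in the paragraph above is easy to check. All the figures are the post's own assumptions, not Apple data:

```python
# Back-of-the-envelope check of the idea-yield estimate.

reps = 100                  # representatives listening in stores
ideas_per_rep_per_week = 1  # reasonable ideas gleaned per rep per week
noes_per_yes = 1000         # "say no to 1000 things" filter

weeks_per_good_idea = noes_per_yes / (reps * ideas_per_rep_per_week)  # 10.0
good_ideas_per_year = 52 / weeks_per_good_idea                        # 5.2
```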

Friday, October 07, 2011

gone too soon, but still not done

Just over a month ago, I wrote (paraphrasing) that it was inconceivable that Steve Jobs was done. While fate has since robbed him of the pleasure of carrying out his plans personally, that he had plans for the future is certain (corroborated by Eric Schmidt), and that he was well aware he might not be around to see them through is just as certain. Some such plan may be laid out in his will, and there may be some hint of it in his official biography, but given his belief in the necessity of secrecy it's unlikely that the whole plan can be found in any combination of public sources.

That he had the means available to set something significant in motion is also certain, between his personal wealth and the array of people with whom he had strong personal connections. That he had the vision to do so should be apparent from his record at Apple, NeXT, and Pixar.

It's also likely that his plans don't particularly revolve around Apple, not because he'd run out of ideas for the company, but because it became necessary to turn over control of the company to others, and apart from perpetuating the culture that made the company so successful in the first place, he wouldn't want to constrain their freedom to respond to evolving technology and market conditions. Also, his $6 billion would scarcely make a dent in Apple's prospects, paling as it does in comparison with the company's cash reserves; but turned in some other direction and invested carefully, it could make a huge difference, and still secure the financial future of his family.

So, while I can barely finish writing this through the tears, I'm still expecting something insanely great from the mind of Steve Jobs, perhaps even something that will capture the imagination of millions and change the world more profoundly than anything he lived to carry through himself.

Sunday, October 02, 2011

These grievances are not all-inclusive.

The Declaration of the Occupation of New York City (as edited on 10/1/11) appears below without comment.

Declaration of the Occupation of New York City


As we gather together in solidarity to express a feeling of mass injustice, we must not lose sight of what brought us together. We write so that all people who feel wronged by the corporate forces of the world can know that we are your allies.

As one people, united, we acknowledge the reality: that the future of the human race requires the cooperation of its members; that our system must protect our rights, and upon corruption of that system, it is up to the individuals to protect their own rights, and those of their neighbors; that a democratic government derives its just power from the people, but corporations do not seek consent to extract wealth from the people and the Earth; and that no true democracy is attainable when the process is determined by economic power. We come to you at a time when corporations, which place profit over people, self-interest over justice, and oppression over equality, run our governments. We have peaceably assembled here, as is our right, to let these facts be known.

They have taken our houses through an illegal foreclosure process, despite not having the original mortgage.
They have taken bailouts from taxpayers with impunity, and continue to give Executives exorbitant bonuses.
They have perpetuated inequality and discrimination in the workplace based on age, the color of one’s skin, sex, gender identity and sexual orientation.
They have poisoned the food supply through negligence, and undermined the farming system through monopolization.
They have profited off of the torture, confinement, and cruel treatment of countless animals, and actively hide these practices.
They have continuously sought to strip employees of the right to negotiate for better pay and safer working conditions.
They have held students hostage with tens of thousands of dollars of debt on education, which is itself a human right.
They have consistently outsourced labor and used that outsourcing as leverage to cut workers’ healthcare and pay.
They have influenced the courts to achieve the same rights as people, with none of the culpability or responsibility.
They have spent millions of dollars on legal teams that look for ways to get them out of contracts in regards to health insurance.
They have sold our privacy as a commodity.
They have used the military and police force to prevent freedom of the press.
They have deliberately declined to recall faulty products endangering lives in pursuit of profit.
They determine economic policy, despite the catastrophic failures their policies have produced and continue to produce.
They have donated large sums of money to politicians, who are responsible for regulating them.
They continue to block alternate forms of energy to keep us dependent on oil.
They continue to block generic forms of medicine that could save people’s lives or provide relief in order to protect investments that have already turned a substantial profit.
They have purposely covered up oil spills, accidents, faulty bookkeeping, and inactive ingredients in pursuit of profit.
They purposefully keep people misinformed and fearful through their control of the media.
They have accepted private contracts to murder prisoners even when presented with serious doubts about their guilt.
They have perpetuated colonialism at home and abroad.
They have participated in the torture and murder of innocent civilians overseas.
They continue to create weapons of mass destruction in order to receive government contracts. *

To the people of the world,

We, the New York City General Assembly occupying Wall Street in Liberty Square, urge you to assert your power.

Exercise your right to peaceably assemble; occupy public space; create a process to address the problems we face, and generate solutions accessible to everyone.

To all communities that take action and form groups in the spirit of direct democracy, we offer support, documentation, and all of the resources at our disposal.

Join us and make your voices heard!

*These grievances are not all-inclusive.

Wednesday, September 28, 2011

monitors that aren't also computers, an endangered species

In an article titled Apple Thunderbolt Display teardown: So many chips it’s hard to believe there’s no computer inside 9to5mac passes along this observation:

iFixit says that both sides of the logic board are packed with so many chips “that it’s hard to believe there’s no computer inside”.

Consider that we've arguably already reached saturation, the point at which incremental improvements in computational power no longer produce noticeable improvements in the user experience on displays the size of the Thunderbolt Display, even using high-end CPUs and GPUs unconstrained by power dissipation. Given that, and given the inexorable migration of high-end performance into low-power, low-cost, highly integrated chips of the sort found in the iPad, how long will it be before it simply makes no sense to build something as complex as the Thunderbolt Display without also making it a computer in its own right? Five years? Ten? (The same logic could be applied to TV tuner hardware.)

Saturday, September 24, 2011

HP needs to rediscover its roots

John Dvorak says Meg Whitman will get nowhere as CEO of HP, and won't last two years in the position.

IMHO, HP needs someone who remembers what HP's strengths once were, when it had some, someone with the patience to attend to detail, someone like my sister-in-law. Let's just call her J.

J worked in inventory control at an HP facility not located in Silicon Valley. By the time she retired, she knew just about all there was to know about inventory control and the software used to manage it. (Not her bragging, but me recognizing the ring of sterling competence on the rare occasion she talks about the work she used to do.)

No, she doesn't know all there is to know about running a company, but she's sensible and, outside of her kitchen, knows how to delegate responsibility. What's more, I'd be willing to bet she'd come out of retirement for a fraction of what they're paying Ms. Whitman.

something from The Wayback Machine

The year was 1997, but what Steve Jobs had to say sounds like it might have been written in 2011… (see video)

Thursday, September 22, 2011

"time's up, pencils down"

Ever wonder how John Gruber got to be as popular a pundit as he's become? His Sept. 21st piece about Apple's fall event, which has in previous years been about music, their iTunes Store and software, and iPods, is a case in point. It's a combination of reasonably good command of the language, plain old common sense, and a light touch of humor, in addition to the occasional tip from an inside source. He's not afraid to admit when he's posting sheer speculation and has a pretty good batting average, but mainly he lays out his reasoning for anyone to inspect who cares to go to the trouble. Agree with him or not, you have to respect his approach.

Wednesday, September 07, 2011

Hoffa suffers brain malfunction

As reported by 9to5Mac, Jim Hoffa, President of the International Brotherhood of Teamsters Union, in an interview on State of the Union, has characterized Apple, Inc. as being unpatriotic, saying:

“Look at Apple, they have $76 billion dollars in their checking account, and they’re not spending it… instead of investing here, everything they do is in China, or in Asia somewhere… There’s something wrong with that.”

He might do better to ask what the institutions in which it is invested are doing with Apple's money, since it's certainly the case that they haven't stuffed it into a mattress.

Monday, September 05, 2011

larger touch screen devices from Apple

The Unofficial Apple Weblog sometimes combines a poll with comments, which, given the nature of their readership, can produce interesting results. Earlier today, Erica Sadun began one such combination, titled You're the Pundit: Will iOS and OS X merge?. Now she has followed that with another, titled You're the Pundit: Are we going to see a touch iMac? Since my AIM password seems not to be working, here is what I'd intended to post there.

By the time the iPhone SDK was released, I had an idea for an app, but it wasn't until the iPad came along, with its larger screen, that I was able to produce something publishable, and even then I felt hemmed in. The full-blown app I have in mind really needs at least a 20-inch screen, and 27 inches would be better, but that screen still needs to be responsive to touch. At such screen sizes, a touch-based operating system that doesn't allow for moving on-screen objects around arbitrarily, as on a desktop, causing them to perform actions, interact, or be acted upon, will probably seem quaint and hamstrung. Since I fully expect Apple will eventually produce larger touch screen devices, it seems likely that we haven't yet seen all of what they have planned for such an environment, much less all of what they have in mind for gestural computing in general. There may even be a place for windows on larger touch screen devices, but OS X's windowing system would need significant reworking, and I'd expect any window-related APIs to bear the "UI" prefix, so the OS is likely to be called "iOS", even though it will have drawn further inspiration from Mac OS and provide a user experience closer to Mac OS than that of other iOS devices. Hopefully, such devices will come with front-facing stereo cameras, for tracking gestures that don't involve touching the screen.

Saturday, August 27, 2011

what Steve Jobs has yet to do

Certainly, Steve Jobs could sit on his hands, attend board meetings, show up at the Apple campus occasionally, and otherwise do nothing, for as long as life and breath remain to him. He could, that is, if he were someone else.

But don't expect more of the same from him; others are quite capable of carrying Apple's products and services forward, and Steve's time is too precious for him to be spending it on what others can do (except as he might find dabbling therapeutic).

With unique abilities comes unique responsibility, and Steve's abilities are at least a rare combination, if not altogether unique, amplified by the tremendous resources his past successes have placed within his reach.

Moreover, he commands the attention of millions; even his offhand remarks are routinely widely distributed.

From where I sit, there's no telling what he will choose to do with all this, but I'm anticipating something insanely great!

Monday, August 15, 2011

the importance of robotics to the achievement of sustainability

I firmly believe that (short of convincing the vast majority of people to return to subsistence farming, something which could only be accomplished through intense coercion) robotics is vitally important to achieving sustainability. This belief so permeates my thinking that it seems necessary to state it explicitly.

I won't be making any arguments in support of this belief today, but just wanted to get it out there, plainly stated.

Wednesday, August 10, 2011

a sleeping dog and a bush

In describing the current state of military robotics, IEEE Spectrum says...

Some of the DOD's most advanced UAVs carry dozens of sensors, including high-resolution night-vision cameras, 3-D imagers, and acoustic arrays. Yet most cannot distinguish a sleeping dog from a bush, even at high noon.

We humans, as participants in the larger economic, social, and political currents of our time, suffer from similar perceptual inadequacy. Many of us fail to understand which among the presumptive alternatives (parties and candidates) comes closest to knowing the way to a better future and intending to lead us there.

More of us fail to comprehend, or forget, that it's up to us, both individually and collectively, to help make that better future possible. Even with the most enlightened of leaders in power, it isn't enough to support that leader's agenda, since such agendas inevitably become bogged down in the struggle to rise above the muck of long-since co-opted politics, forced to compromise away much or most of the content that made them worth supporting in the first place.

But just because compromise is inevitable doesn't mean that we should therefore point to the shadow and call it the light. Our struggle is with those who would, if allowed, take us back to a feudal society, divided between aristocrats and serfs, or something very like it. Not only do they seek to reinstitute classist society, but their effort to do so distracts us from other matters, such as climate change, pollution, the loss of farmland to spreading cities, and the loss of soil to erosion.

We have no choice but to fight both wars at once, to put the devil of aristocracy back into chains and to remake our material culture into something sustainable, able to continue on indefinitely without fouling the planet we all depend upon.

Saturday, July 16, 2011

this is my next dot com

I really don't have much to say about it, other than: go check it out.

The people behind it are mainly former usual suspects from Engadget, although that's quickly becoming less true as they add new staff. "This is my next ..." is actually a placeholder name for whatever this conglomeration of talent, emotional entanglement, and momentum eventually evolves into.

The most interesting aspect of ThisIsMyNext, as it currently exists, is the weekly podcast, usually produced on Thursdays and available from the website the following morning, and from iTunes sometime after that. It's mainly about mobile device platforms (iOS, Android, WebOS, etc.) and specific smart phones and tablets, and they tend to avoid talking about Apple too much.

Frankly, except for iOS and except for the implications of Android for robotics, I find it hard to care about the subject of their discussions, but I still love listening. They know what they're talking about, even when they don't agree, and they know each other well enough to be fluid and engaging in the way they go about it.

I wish the same could be said about Congress.

Wednesday, June 29, 2011

is Martin Ford right enough? does it matter?

In evaluating Martin Ford's thesis in The Lights in the Tunnel, the question isn't whether he's entirely right. Rather, for his argument to have no point, he must be entirely wrong; otherwise the danger remains that he might be right enough, that the process he outlines might in fact result in economic collapse due to a collapse of effective demand (demand combined with purchasing power), brought about by too many jobs being taken away by automation.

On the other hand, does it really matter, given an economy that seems to depend, for its long-term health, on an infinite supply of land, water, raw materials, and labor (or its mechanical substitute), an infinite market, and an infinite landfill, none of which actually exist? If the collapse of demand doesn't bring it down, something else will.

The need to fundamentally restructure our economic arrangements is looming and unavoidable.

Moreover, by reducing the need for anyone to engage in dangerous or demeaning work, robotics may actually make this transformation easier.

Sunday, June 19, 2011

augmentation: the other side of the robotics coin

Toward the end of the first installment of my response to Martin Ford's The Lights in the Tunnel I said the following:

just as technology enables automation, it also enables augmentation - strength amplification, protection against environmental hazards, heads-up displays providing just-in-time information, enhanced senses, precise manipulation, eye tracking, voice recognition and synthesis, etc. - making what the average human worker is able to perform a moving target

Under pressure from the growing complexity of the aircraft it produces, Boeing has been a pioneer in using virtual reality overlays to provide people possessing general skills with the specific technical information needed to perform the tasks at hand, making it unnecessary for workers to be experts on the systems they build or maintain. That expert knowledge is maintained in a database and served to the worker just when it's needed. (Such an approach could also enable farmers to maintain robotic machinery with which they had no prior experience.)

DARPA has demonstrated keen interest in providing soldiers with wearable equipment that can enhance their strength and stamina, allowing them to carry more weight further, faster, over terrain too difficult for wheeled vehicles. They have also funded fully robotic solutions, but so far the augmentative approach looks more promising.

These examples combine nicely. A factory or maintenance facility worker with a powersuit would be able to handle heavier parts without the need for hoists, and a soldier with a heads-up display would be less likely to get lost, or to waste time and effort on inefficient paths.

Telepresence and teleoperation make it possible for human workers to be on the scene, instantly, when needed. Using the example of an automated transportation system, this could mean welcoming passengers and verifying that they and their belongings are entirely inside the vehicle before closing the door, ascertaining a destination, operating active components to secure assistive devices, checking whether problems develop en route, ensuring passenger security at the destination before opening the door, checking whether the vehicle needs to be cleaned or repaired before being used again, and actually directing the vehicle anywhere it needs to go outside of the track/guideway system, with the aid of onboard sensors and intelligence. For each of these functions there might be an automatic mode, with a human operator monitoring in questionable circumstances and intervening whenever the automatic mode proved inadequate, when experience suggested that it would be likely to do so, or when a particular passenger had indicated a preference for dealing with a human operator and time permitted.
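The supervisory pattern just described, automatic handling with escalation to a human operator, can be sketched roughly as follows. This is purely illustrative; all the names, thresholds, and figures are invented for the example.

```python
# Hypothetical sketch of human-in-the-loop dispatch for an automated vehicle.
# Automation handles each task unless it reports low confidence, the task has
# historically needed intervention, or the passenger has asked for a person.

def choose_handler(task, auto_confidence, risk_history, prefers_human, operator_free):
    """Return 'automatic' or 'human' for a given vehicle task."""
    if prefers_human and operator_free:
        return "human"                    # passenger preference, time permitting
    if auto_confidence < 0.8:
        return "human"                    # automation unsure of this situation
    if risk_history.get(task, 0.0) > 0.2:
        return "human"                    # task has often required intervention
    return "automatic"

history = {"secure_wheelchair": 0.35, "close_door": 0.01}
assert choose_handler("close_door", 0.95, history, False, True) == "automatic"
assert choose_handler("secure_wheelchair", 0.95, history, False, True) == "human"
```

The point of the sketch is only that the division of labor is a policy decision, adjustable per task and per passenger, not something baked into the hardware.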

When a human is part of the solution, you get a highly evolved brain and basic senses in the bargain. Ford makes the point that, for many jobs, what a human brings to the table is more than is needed, and that providing technical analogs for just the portion that is needed is commonly either already possible or within reach. While I grant the truth of this, I also want to point out that within reach and affordable are far from being the same thing, and that just because you can replace a person with a machine in a particular circumstance doesn't mean that doing so constitutes a reasonable business decision. Moreover, the 'excess' capacity of a human worker may be just what's needed to prevent an anomalous situation from turning into a disaster, saving the company far more than the difference between wages and benefits and the cost of ownership and operation of some replacement machine.

Because technical augmentation tends to move human workers from mind-numbing work into positions where they are both more stimulated and have a higher level view of the overall operation, it also pays off in terms of developing experience in those workers as individuals and in the workforce as a whole.

Aside from the simplest repetitive tasks, the return on the investment dollar for technology to enhance a human worker's capabilities is very likely to be both greater and more immediate than the return on investment for the more sophisticated technology needed to actually replace that worker.

Wednesday, June 15, 2011

where businesses are putting their money

The post Man vs. Machine on the NYTimes blog Economix is closely related to my ongoing discussion of Martin Ford's The Lights in the Tunnel, and lends support to his contention that economists are wrong in their offhand dismissal of the possibility that automation/AI/robotics may produce permanently high unemployment, resulting in the collapse of the economy, unless something is done to preserve consumer spending power despite unemployment.

The comparison between change in spending on equipment and software versus change in spending on payroll and benefits is particularly telling. Employee compensation is growing, but at about one tenth the pace.

Sunday, June 12, 2011

agricultural robotics and employment

At least with regard to agriculture, the effect of robotics upon employment depends on the approach taken. If your goal is to further reduce the number of people deriving an income from farming, and you are willing to accept any other sort of expense to that end (autonomous tractors for instance), then you can probably manage to reduce the percentage of the workforce engaged in agricultural production to an even smaller fraction of 1%.

If your goal is to maximize the production of those crops that are easily produced and handled in bulk and survive long-term storage well, in the interest of generating return on capital investment and foreign exchange, and only care about how it's done insofar as that impacts the bottom line, you might conclude that capital expenditures to further minimize payroll would generally not be cost effective, that it would cost more to replace the remaining workforce than to keep it.

However, if you're interested in guaranteeing the sustainability of production far into the future, despite climate change, while also halting soil loss, ending the use of poisons, preserving remaining diversity in both crop and native genomes, and rebalancing production for healthier diets, you may need both more sophisticated machinery and all the people you can recruit.

Such a complicated goal implies complex operations, and complex operations imply a large variety of tasks, some easily mechanized and others common enough to make mechanization worthwhile, even though challenging. Those that are neither common nor easily mechanized will fall to human workers, farmers and farmhands, who are far more adaptable than any machine.

At some point in the future it may become possible to build machines adaptable enough to take the place of a farmer, but until the annual cost of ownership of such a machine drops below the annual cost of one human worker, it won't make economic sense to deploy them, and without an infrastructure to drive down the cost of robotics, that may never happen.

Cross-posted from my Cultibotics blog.

Monday, May 30, 2011

Ford's Lights in the Tunnel, early quibbles

I haven't read far enough through the book yet to know whether Ford actually stands behind these positions, or has simply propped them up as straw men.

In the Introduction, on page 5, he writes, "The disintegration of the Soviet Union in 1991 demonstrated quite conclusively that there is no good alternative to the free market system." Perhaps he was attempting to ingratiate himself with economists in saying this, or attempting to preempt argument based on the Soviet example, or maybe he truly believes it. Whatever his purpose, the statement by itself is a blatant non sequitur. The Soviet Union was a single example of an alternative, or maybe a class of alternative examples, since they tried just about every permutation of their own model at one time or another (some of which worked rather well in microeconomic terms, by the way). But their ideology-driven model severely constrained which experiments were possible, and even more so which could be given sufficient latitude for a fair test.

Then, in Chapter 1, in the Automation Comes to the Tunnel thought experiment beginning on page 17, he discusses temporarily increased profits deriving from reduced costs made possible by automation, but he completely neglects the secondary effect of growth and jobs created in the automation/robotics industry, in design, customization, testing, sales, production, shipping, installation, maintenance, programming, and retooling. These may not add up to the number of jobs replaced by machines in other industries, but it's too large a factor to be ignored.

Now, back to the Introduction, page 2, for a consideration of the following: "Put yourself in the position of a business owner and think of all the problems that are associated with human employees: vacation, safety rules, sick time, payroll taxes, poor performance...maternity leave. If an affordable machine can do nearly any routine job as well as a human worker, then what business manager in his or her right mind would hire a worker?" This turns on the word "affordable," which at best comes down to a projection, made by a CFO based on incomplete information, regarding whether the business will profit more from keeping its workers or from replacing some of them with machines, and it's not as simple as comparing the cost of a machine with the annual cost of the workers it could replace multiplied by the machine's estimated useful life. People are more adaptable, and can move from one task to another with a minimum of fuss, whereas a machine would at least need to be reprogrammed (or retrained) for each new task, and might even prove useless in the new circumstances; the more specialized the machine, the less likely it is to be able to adapt. Also, just as technology enables automation, it also enables augmentation - strength amplification, protection against environmental hazards, heads-up displays providing just-in-time information, enhanced senses, precise manipulation, eye tracking, voice recognition and synthesis, etc. - making what the average human worker is able to perform a moving target.
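The naive version of that CFO's projection can be shown in a toy break-even calculation. All figures here are invented, and the calculation deliberately ignores the adaptability and 'excess capacity' arguments made above, which is precisely what's wrong with it.

```python
# Toy break-even comparison, with invented numbers: the naive rule says replace
# a worker only if the machine's total cost of ownership over its useful life
# undercuts the worker's compensation over the same period. Everything this
# omits (adaptability, disaster prevention, retooling risk) favors the human.

def naive_machine_wins(machine_price, annual_upkeep, useful_life_years,
                       annual_compensation):
    machine_total = machine_price + annual_upkeep * useful_life_years
    worker_total = annual_compensation * useful_life_years
    return machine_total < worker_total

# A $200k machine lasting 8 years with $15k/yr upkeep vs a $45k/yr worker:
assert naive_machine_wins(200_000, 15_000, 8, 45_000)      # 320k < 360k
# The same machine against a $35k/yr worker no longer pays:
assert not naive_machine_wins(200_000, 15_000, 8, 35_000)  # 320k > 280k
```

Even on its own terms the answer flips with modest changes in the inputs, which is why "affordable" is doing so much work in Ford's sentence.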

The situation would seem to be less bleak than Ford's first pass through the tunnel suggests.

Sunday, May 29, 2011

wishing for the Janus (times 2 or 3) online locus

I've recently been making more use of Twitter, Facebook, and LinkedIn, somewhat at the expense of participation on The WELL, but not entirely so. Each of these has something to offer, and leaves something to be wished for. I've also ramped up my use of RSS (until I became overwhelmed and had to shut it back down), and have three blogs (including this one), a couple of homepages, one dormant, and a couple of dormant domain names.

The blogs are all on Google's Blogger, so that's a single identity, and the active homepage is on The WELL, so that combines with my participation there to form another identity. RSS, the dormant homepage, and the domain names don't really count, for now, but that still leaves me with FIVE online identities, without including accounts on the systems of companies with which I do business.

Meanwhile there's a herd of other social networking sites wanting a piece of that pie, and more joining the melee all the time. It leaves me wondering what they could possibly be thinking, given the time and mental effort participants in existing sites have already invested, and amazed at the numbers reported by the more successful of the newcomers.

But I don't want more places to spread myself across. I want a single service that allows me to present my various aspects as parts of a single whole, allowing me to selectively expose some or all on a per-contact or per-group basis, and which allows me to make finer distinctions regarding sources of input than follow, like, or connect.

As for RSS, it's not the particular feed but the entity behind it, the specific organization, program, university department, startup company, or corporation, that I'm interested in and want to track. Not all of these publish the news I'd like to know about as RSS feeds; some publish press releases to mailing lists, and some merely update their websites. Some even publish news as YouTube videos. I'd like to be able to combine all such sources into a single interface, keeping the extraneous noise to a bare minimum.
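The 'single interface' wished for above amounts to normalizing items from heterogeneous sources into one shape and merging them. A minimal sketch, with all source names and fields invented:

```python
# Sketch of a unified inbox: RSS entries, press-release emails, and page
# updates are reduced to a common record, then merged newest-first.
from datetime import datetime

def normalize(source, title, when):
    """Reduce an item from any source to a common record."""
    return {"source": source, "title": title, "when": when}

def unified_inbox(*feeds):
    """Merge any number of normalized feeds, newest first."""
    items = [item for feed in feeds for item in feed]
    return sorted(items, key=lambda i: i["when"], reverse=True)

rss = [normalize("lab-rss", "New dataset released", datetime(2011, 5, 2))]
mail = [normalize("press-list", "Funding announcement", datetime(2011, 5, 4))]
inbox = unified_inbox(rss, mail)
assert [i["source"] for i in inbox] == ["press-list", "lab-rss"]
```

The hard part, of course, isn't the merge; it's the per-source adapters and the noise filtering, which is why I'd rather someone build the service than build it myself.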

So, while new systems competing for your attention, bringing new themes and new variations on the old ones, may help to build out the possibilities of online networking and information distribution, I look forward to the day when these upstarts have combined to form a smaller number of more complete systems, been acquired, or themselves swallowed one of the whales of social networking.

PS, I completely forgot about my Yahoo!/Flickr account, which adds two Yahoo! groups and a Flickr photostream, and a sixth online identity!

Wednesday, May 25, 2011

confidence is a perishable commodity

Lodsys had best bite their tongues, before they inflict demonstrable damage on the perishable commodity that is the confidence of Apple's developers. Otherwise they may find their patents are worth less than the damage they've done, and are therefore forfeit.

Sunday, May 22, 2011

in search of the way forward

An old acquaintance suggested I check out Martin Ford's "The Lights in the Tunnel" which I'm in the process of doing.

The fast, first-pass take is a little scary. It seems to be about how the economy is falling apart because too many people have no purchasing power, because their jobs have been shipped offshore, automated, or both.

Ford has some suggestions about how to deal with this and I've had some thoughts along these lines myself, so I anticipate using his writings to reenergize and hone my own thinking and sharing the result of that process here.

One tentative conclusion I'd reached just shortly before hearing about "The Lights in the Tunnel" was that, generally speaking, when robotics is applied to bringing a better approach to bear on some task (doing things in progressively greater detail, taking more and more into account), the result is usually a net gain in employment. I'm not sure this is generally true, but I'm nearly certain it's true in circumstances where people have already been all but completely replaced by machines, as is the case in modern agriculture, where humans have mostly been relegated to the role of machine operator, serving as the control unit that it hadn't until quite recently been possible to build.

The application of robotics to the conduct of horticulture on an agricultural scale is a longtime theme for me; I have another blog on that subject, so chances are I'll be returning to that example from time to time, but, as Ford is at some pains to point out, this is an issue which transcends any category of economic activity.

It is clear at the outset that it is the inertia of our socioeconomic arrangements that threatens a crisis in response to the liberation being made possible by emerging technologies; under a different set of such arrangements, one we don't yet know how to name, we might welcome that liberation as a godsend.

More to follow...

Saturday, May 21, 2011

in garb appropriate to the slaying of trolls

I just had a humorous thought: when WWDC 2011 rolls around, a little over two weeks from now, Steve Jobs takes the stage in full armor, carrying a great sword (all fashioned from aluminum for the occasion), which is to say in garb appropriate to the slaying of trolls.

Sunday, May 01, 2011

plea to GOP

Please, please, give us a presidential candidate whose candidacy can be conducive to constructive debate.

In case you're wondering, IMHO that would exclude Donald Trump.

Wednesday, April 20, 2011

FCC Chairman Genachowski Interviewed by TechCrunch

TechCrunch's Jason Kincaid talks with FCC Chairman Julius Genachowski. If you want to know what drove the creation of the rules currently being batted around by Congress, watch this video.

Saturday, March 26, 2011

Kodak may be in for a surprise

Kodak is saying that, should Apple and RIM fail to settle, they may be liable for $1 Billion.

Kodak's market cap is slightly lower than that.

Why should Apple and RIM fork out $1 Billion in damages when they could buy the company for about the same amount?

Monday, March 07, 2011

Scoble Tours SRI and Gets Scoop on Siri

Robert Scoble, a frequent contributor to Rackspace's Building 43, recently toured SRI and conducted interviews with senior staff. Videos of those interviews have just been published on TechCrunch. A related, long interview, about the spin-off and subsequent acquisition by Apple of Siri, creators of the iPhone app by the same name — with Norman D. Winarsky, Vice President of Ventures, Licensing and Strategic Programs at SRI, Gary J. Morgenthaler, General Partner at Morgenthaler Ventures, and Shawn T. Carolan, Managing Director at Menlo Ventures — is linked from the TechCrunch article in two parts (Part I, Part II). There is a notable quote in Part II (beginning at 8min 30sec): "Think of the world's best AI technology in the hands of the world's greatest consumer electronics company." While the AI technology Apple acquired with Siri may represent a competitive advantage, it also represents the state of the art, and suggests that AI of the sort used by Siri is ready for prime time.

Thursday, February 24, 2011

Lion Server to be integrated (included) with Lion

Apple frequently opts for non-alternatives, that is for options the pundits hadn't even considered.

Such is the case with the inclusion of Lion Server in every copy of Lion! Just for starters, this means that every Mac running Lion will put Wiki Server at the user's fingertips, and very likely make the code behind it available to developers.

Just how much of the current, distinct server version will carry over to be included with Lion remains to be seen. Apple may decide to put some components representing large investments on the App Store, but they'll run on the stock version of Lion and be a cinch to install.

I can just see the ad now: "Mac OS X Lion, the world's most complete operating system."

Tuesday, January 18, 2011

replacing the iTunes app with a web app

9to5mac reports rumors that Apple plans to replace the iTunes app with a Safari-only web app.

Before commenting on how credible this might be, let me suggest a little experiment. Fire up iTunes and point it to the iPhone/iPod/iPad app store, also fire up the Mac app store application, then do a few parallel searches for apps that exist on both platforms. Note how the performance of the iTunes app compares with that of the Mac app store application.

Apple has been building out a web version of iTunes for some time. Take the Twitter app, for instance. If you search for Twitter in the iOS app store using the iTunes app and click on the bird, nothing will happen, because you're already there; but if you right-click (control-click) on the bird, choose copy link, then paste that link into Safari and activate it, instead of being taken back to the iTunes app you're taken to a look-alike web page.

Given that Apple has a contractual obligation to constrain the installation of non-free apps to devices owned by people who've paid for those apps, it makes sense that they would limit access to a web version of the iTunes store to Safari, which they can control, possibly using a plug-in for this purpose. They might also use Safari only to conduct transactions with the store and use a separate program to manage the configuration of various devices.
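For illustration only: the crudest way to 'limit access to Safari' is a server-side User-Agent check. I have no idea what mechanism Apple would actually use, and real enforcement would need something stronger (the plug-in handshake mentioned above, say), since a User-Agent string is trivially spoofed.

```python
# Illustrative sketch of browser gating via the User-Agent header.
# Not Apple's actual mechanism; a UA check is advisory at best.

def looks_like_safari(user_agent: str) -> bool:
    ua = user_agent.lower()
    # Chrome's UA string also contains 'Safari', so exclude it explicitly.
    return "safari" in ua and "chrome" not in ua

safari_ua = "Mozilla/5.0 (Macintosh) AppleWebKit/534.52 Version/5.1 Safari/534.52"
chrome_ua = "Mozilla/5.0 (Macintosh) AppleWebKit/535.1 Chrome/14 Safari/535.1"
assert looks_like_safari(safari_ua)
assert not looks_like_safari(chrome_ua)
```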

Whatever the details, there's probably some truth to this rumor.

Friday, January 14, 2011

what I'm talking about

After years of anticipation, here it is, the brush you can use to paint on a screen!

Now, imagine using one of these on a screen that has touch sensitivity in each and every pixel, instead of the current, lower-resolution grid!

Sunday, January 09, 2011

iRobot's AVA uses iPad for controller and telepresence interface

If you can program an iPad (or Android tablet) you can program iRobot's AVA.

Engadget originally characterized AVA as a telepresence machine, using an iPad sitting on a mount that incorporates a camera as the interface, but the AVA will be a general purpose platform complete with SDK, which, for the iPad, will be compatible with Xcode and the iOS SDK.

Great move, iRobot!

Saturday, January 08, 2011

if you've already paid for Pixelmator, buy it again now!

If you have Pixelmator 1.6.3 installed on your Mac, don't be fooled if running "Check Now" in Preferences => Updates tells you that you are running the latest version. 1.6.4 is waiting for you on the Mac App Store!

What? Pay for it again? Well, as an email sent to licensed users says "You can download Pixelmator on the Mac App Store for just $29, for a limited time. By transitioning to the Mac App Store, you will get the totally awesome Pixelmator 2.0 (and, of course, still lots of 1.X updates) for free once it is out in the Mac App Store later this year."

So think of it as a prepaid upgrade. True, those who hadn't yet paid for the program will get the same deal, but at least they'll be joining you in helping to underwrite one of the finer pieces of software in existence.

If you wait, the price will go up, so act now.

Friday, January 07, 2011

24 hours and a million downloads later

One of the most interesting aspects of the Mac App Store is how smoothly the application used to access it works. My guess is that it's based on WebKit and that the store itself uses SproutCore or something like it. In any event, it's a considerably more pleasant experience than using the iTunes app to access the iOS app store.

Should help sales!

Tuesday, January 04, 2011

what I'm talking about

In a previous post I discussed the possibility of using an iPad as a Mac peripheral, and stated that I believed there were already iPad apps which made this possible, speculating that they probably paired with specific Mac apps.

Now, Macworld has found one that augments the Mac's Finder. Remote Conductor comes in two parts, the iPad app and a free Mac server application with which the iPad app cooperates, communicating via wifi. With this setup established, Remote Conductor offers three modes: one which turns the iPad into a large, multitouch trackpad, another which turns it into something resembling a grid display of your Applications folder from the Dock, and a third which lets you navigate through open windows by application (scrolling left/right) and through an application's open windows (scrolling up/down).

I didn't see a price listed anywhere on the company's website, but a quick trip to the App Store shows that it sells for $9.99, which is a little steep unless you'll be using it a lot; if it works as advertised, I'm sure many will. (Oh, and they're working on a Windows version of the server software.)

Very interesting, but not quite what I was talking about. What I'd really like to see would be a more general iPad app (or mode) that works either with a server or, preferably, with any app that includes and makes use of a framework for remotely controlling the iPad's display and directing input received from it. The latter is the easier option from a developer's standpoint, since a separate server program would have to be driven by sending script-level messages to the services it presents, much as Automator does, something fewer developers know how to do.

Even better would be to do both. Since at least a minimal server process would be necessary, it might as well be owned by the operating system, and present applications with a choice of either passing low level data, generated by and targeted to code compiled using a framework designed for this purpose, or else interacting with the iPad via scripting. Those scripting hooks could actually be included in Automator, making them accessible to a much larger audience.

For security reasons, the part of this combo running on the iPad should be sandboxed at least to the same extent as other iPad apps, so that it had no more access than they do to data stored on the iPad. Because Mac apps aren't necessarily subjected to review, it might be better if it were even more isolated, as a service provided by iOS running in a special, locked-down mode. It could be enabled via a second button, like the one on the lock screen that puts the iPad into slideshow mode, and interrupted at any time by pressing the Home button (or as determined in Settings).

However the implementation were to be handled, the end result should be to make it possible for a Mac app to run part of itself on an iPad, taking advantage not only of the iPad's touchscreen but also of the lion's share of its CPU/GPU time and RAM, as well as as much nonvolatile memory as any app is allowed, with Mac UI elements compiled using AppKit and iPad UI elements compiled using UIKit.

Unless and until Apple decides to make such a framework (and/or script server) available, it's still possible, right now, to create iOS apps which communicate and cooperate with Mac (Windows, etc.) applications, providing those applications with what amounts to a smart touchscreen peripheral.
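The cooperation described above boils down to the two ends agreeing on a message format. As a sketch, here is length-prefixed JSON framing of the kind either side might use over a socket; the message names ('touch', etc.) are invented for illustration, and Python stands in for whatever the apps would actually be written in.

```python
# Sketch of a shared wire format for a Mac app and an iPad app cooperating
# over a socket: 4-byte big-endian length prefix, then a JSON body.
import json
import struct

def encode_message(kind, payload):
    """Serialize a message for transmission."""
    body = json.dumps({"kind": kind, "payload": payload}).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(data):
    """Recover (kind, payload) from received bytes."""
    (length,) = struct.unpack(">I", data[:4])
    msg = json.loads(data[4:4 + length].decode("utf-8"))
    return msg["kind"], msg["payload"]

wire = encode_message("touch", {"x": 120, "y": 340})
kind, payload = decode_message(wire)
assert (kind, payload) == ("touch", {"x": 120, "y": 340})
```

The framing is the boring part, which is rather the point: nothing stops a developer from doing this today, framework or no framework.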

If you're having trouble imagining how this might be useful, imagine an emergency call center with a highly integrated computing environment, with operators sitting at workstations and the shift supervisor walking around behind them carrying an iPad that's displaying an overview of all the traffic passing through the call center, with additional information (the history of calls from a particular number) just a quick tap away.

Sunday, January 02, 2011

here comes the Mac App Store

Opening January 6th.

I might have added "finally" or "at long last" to the title.

Sure, it's an idea that would have been very difficult to implement well just a few short years ago, before the iOS app store illuminated the territory, and in that sense its arrival is timely, but it fills a niche which has lain empty, rich in potential yet gnawingly vacant, at least since the ascendancy of the Internet. (There have been attempts to create a common marketplace for Mac apps, but all have fallen far short of what only Apple was ever in a position to do right, by integrating it into the system software.)

For the independent software developer, it's a godsend! Suddenly they'll have the means to make their wares widely available with little effort beyond that involved in crafting them well in the first place, and with little friction to prevent customers from making the decision to buy.

For users it's an answer to prayer! As the vendors of the third-party programs they use move (or expand) their distribution into the Mac App Store, users will gain a one-stop shop for updates. They'll also gain a simple, trustworthy process for buying new apps, and no more need to assume whatever risk there might be in using PayPal or other, similar services, and Apple's review process will help protect them from malware and poorly written programs.

Most likely, the stated price of apps (not what you can get in bargain basement combo deals) will come down on average, and, between the convenience and safety of the store itself and the better prices, the market will respond with an abrupt increase in sales volume. With a larger market, more effort will be devoted to developing Mac apps, and those apps will, for the most part, be gathered together in one place where the customer can browse through them looking for a best fit for their circumstances. Everybody wins!

What a heartening development!

Saturday, January 01, 2011

Apple in 2011 and beyond

Jonny Evans, writing in his Computerworld blog, has already issued his speculations for what might be forthcoming from Apple during 2011. I won't be going through that list, just pointing you to it.

The way I see it, there are several major trends to be tracked in news from Apple: the development of iOS, Mac OS X, and the cross-fertilization between them; the parallel development of iDevice and Mac hardware; the movement toward cloud computing; and any indication that Apple is transitioning toward increased self-reliance with regard to essential components. There's also the collection of secondary devices and peripherals, which may take on greater importance in the future as it becomes possible to shoehorn a complete system into smaller and smaller boxes.

Of these, I only intend to address Apple's movement toward self-reliance in essential components, branching out somewhat from that starting point.

First, I don't mean to suggest that Apple is about to build or acquire an IC foundry, except perhaps a small-scale one sufficient to allow them to keep their IC designs in-house until they're ready for mass production. As the scale of circuit features has diminished, the cost of the equipment needed to fabricate chips has gone through the roof, and only very large-scale operations are economically viable. This means a handful of foundry operators selling production-line time, and that situation isn't likely to change soon. If Apple were to build or acquire a foundry, they would have to farm out production time when their own needs were slack, and buy additional production time from others when demand for their products grew faster than expected, meaning that ownership would be at best a marginal advantage, and probably not enough of one to justify the investment.

The tools for chip design, on the other hand, have become more affordable, and a cottage industry of small design houses has blossomed as a result. Many of these companies are involved in producing application-specific integrated circuits (ASICs), frequently combining only a small amount of custom circuitry with cores and subsystems licensed from others.

Apple has long been in the ASIC design business, incorporating custom chips into their product designs, but they have recently been bolstering this capability, in part through the acquisitions of Palo Alto Semiconductor and Intrinsity, and, at least for iOS devices, they have moved to custom CPU designs incorporating ARM cores. We can expect more of this, and it wouldn't be terribly surprising if the MacBook Air were to be switched to an ARM-based design.

Apple already has the technology to cross-compile software between CPU instruction sets, having done this first for MC68000 code running on PPC machines, and then for PPC code on Intel machines. Moreover, in the transition from PPC to Intel, they did some work on using an intermediate instruction set for an abstract virtual machine as part of the process of translating code from one platform to another, and they have continued as an active participant in the LLVM project, incorporating that technology into Xcode.

That doesn't necessarily mean they could deploy an ARM-based machine running Mac OS X tomorrow, but chances are they've been working quietly on exactly that, the writing having been on the wall since they determined that the Intel Atom wasn't competitive for use in portable devices. And, while they could use an off-the-shelf CPU in such a machine, it's nearly certain they'd instead opt for an in-house custom design. That would allow them to, for example, incorporate a graphics core optimized for both OpenGL and OpenCL. (They might even bring back the Velocity Engine, in a fresh, multicore implementation.)

But an ARM-based MacBook Air shouldn't be taken as an indication that Apple is about to abandon Intel's processors. While the low-end (plastic) MacBook might also make the switch, it's unlikely that the MacBook Pro line would follow, at least not immediately, and even less likely that the desktop line would drop Intel. Apple has long upheld the principle that software should be hardware-independent, both allowing maximum flexibility in the selection of hardware and helping to ensure that changes in the hardware don't break the software. Simultaneously shipping ARM-based lightweight portables and Intel-based desktop and professional machines - machines that run the same software without the need for fat binaries, because it's all compiled for the LLVM virtual machine - would drive that point home.

So, three predictions: first, expect Mac OS X 10.7 (Lion) to have a legacy environment for software that assumes the Intel CPU architecture, and a native mode for software that's been (re)compiled with the LLVM virtual machine as the target architecture; second, expect the Mac App Store to cease accepting new Intel binaries within a year of the release of Lion, perhaps as little as six months; and third, expect an LLVM-binary compatibility environment in Snow Leopard before they cease revising it (analogous to the inclusion of Carbon in Mac OS 8 and 9).

Back to you, Jonny.